Integration Tests with Spring Boot

12.03.2018
Thomas Körner

Spring Boot provides great support for testing controllers via @WebMvcTest, which allows calling controllers directly via the MockMvc utility. The developer can mock the corresponding service and repository calls and verify both the service orchestration within the controller and the proper construction of the JSON responses. However, this kind of test does not run the complete chain of registered HTTP filters, resolvers, etc.
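
For comparison, a controller slice test of this kind might look roughly like the following sketch. ResourceController, ResourceService and the route are made-up names for illustration; they are not taken from the original application.

import static org.mockito.BDDMockito.given;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.servlet.MockMvc;

// Hypothetical @WebMvcTest slice test: only the web layer is started, the service
// is replaced by a mock, and the deployed filter chain is not exercised.
@RunWith(SpringRunner.class)
@WebMvcTest(ResourceController.class)
public class ResourceControllerSliceTest {

  @Autowired
  private MockMvc mockMvc;

  @MockBean
  private ResourceService resourceService;

  @Test
  public void getResource_ok() throws Exception {
    given(resourceService.findNameById("4711")).willReturn("some resource");

    mockMvc.perform(get("/resources/4711"))
        .andExpect(status().isOk());
  }
}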

In order to test an incoming REST call across all layers of the Spring Boot application, the integration test should target the application the same way any external client would. The diagram below shows a simple REST service orchestration that uses an external authentication service via Spring Security to secure the API routes. The application under test allows certain API routes for all authenticated users, but specific routes only for admin authorities.

integration test illustration

The integration test executes a REST call toward the controller under test. The REST call includes a Bearer token that authenticates the user issuing the call. Using an authentication filter, Spring Security runs a validation call toward an Auth API in order to validate the given token. The Auth API returns the authorities (roles) of the authenticated user. Based on the returned authorities, Spring Security either allows or denies access to the specific API route.
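
The security configuration itself is not shown in this post; with Spring Security it could look roughly like the following sketch, where the /admin/** route pattern and the ADMIN authority are assumptions.

import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;

@Configuration
@EnableWebSecurity
public class SecurityConfig extends WebSecurityConfigurerAdapter {

  @Override
  protected void configure(HttpSecurity http) throws Exception {
    http.csrf().disable()
        .authorizeRequests()
        // routes reserved for admin authorities
        .antMatchers("/admin/**").hasAuthority("ADMIN")
        // all other API routes require an authenticated user
        .anyRequest().authenticated();
    // the token-validating authentication filter that calls the Auth API would be
    // registered here (e.g. via addFilterBefore), omitted in this sketch
  }
}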

Challenges

Integration tests should run for each pull request in order to verify that changes do not break the system. The build infrastructure runs PR builds in parallel in order to provide timely feedback to each developer. Therefore, resources such as persistence stores need to be tenant-aware so that tests do not interfere with each other during test data setup and verification.

Note: Another approach would be to use an embedded database per test, which provides isolation out of the box. The issue with embedded databases is that migrations might not be transferable 1:1 due to differences in SQL syntax. Thus, one would need separate tests for the migrations and might not be able to utilise database constraint checks etc.

Single Test Run

Each integration test runs through the following phases: mocking the authentication users, setting up the database, running the actual test, and tearing down the database.

Mock of Authentication Users

Spring Boot provides the RestTemplateBuilder, which allows requests made via the RestTemplate to be enhanced transparently. The sample factory method below (part of a custom RestTemplateFactory) enhances each request with the HTTP Authorization header, adding the Bearer token used by common JWT token APIs.

  public static RestTemplateBuilder bearerAuthTemplate(String token,
      Optional<Integer> port) {
    return baseTemplate(port).additionalInterceptors(
        (httpRequest, bytes, clientHttpRequestExecution) -> {
          httpRequest.getHeaders().add(HttpHeaders.AUTHORIZATION,
              "Bearer " + token);
          return clientHttpRequestExecution.execute(httpRequest, bytes);
        });
  }
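
The baseTemplate helper is not shown here; presumably it creates the builder and configures the root URI so that tests can use relative paths. A possible shape might be the following, where the default port is an assumption:

  // assumed shape of the baseTemplate helper referenced above
  private static RestTemplateBuilder baseTemplate(Optional<Integer> port) {
    return new RestTemplateBuilder()
        .rootUri("http://localhost:" + port.orElse(8080));
  }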

Targeting the controller under test via this rest template will therefore always include the Bearer token. All integration tests use this TestRestTemplate, which is configured to use a static token for each call. This makes mocking the Auth API calls easy, because the validate call will always carry the static token.

  @Configuration
  public class TestRestTemplateConfig {

    @Value("${scope.storefront.it.auth:bearer}")
    private String auth;

    @Bean
    public RestTemplateBuilder restTemplateBuilder() {
      if ("basic".equals(auth)) {
        return RestTemplateFactory.basicAuthTemplate("user", "password", Optional.empty());
      }

      if ("bearer".equals(auth)) {
        return RestTemplateFactory.bearerAuthTemplate(
            AbstractIntegrationTestBase.BEARER_TOKEN, Optional.empty());
      }

      // no auth
      return RestTemplateFactory.noAuth(Optional.empty());
    }
  }

The authentication call toward the Auth API is intercepted using the MockRestServiceServer. This server allows intercepting calls triggered by the TestRestTemplate. The listing below shows how to set up a user with the expected authorities. Note the Bearer token, which is part of the validate request according to the application's authentication logic.

// bind the MockRestServiceServer to the restTemplate used to call the controller under test
private void initMockRestServiceServer() {
  mockRestServiceServer = MockRestServiceServer.bindTo(restTemplate).ignoreExpectOrder(true).build();
}


// mock the auth service call and return the given authorities
private void mockAuthService(final String userId, final UserRole... roles) {
  mockSsoServer(MockRestResponseCreators
      .withSuccess("{\"userId\": \"" + userId + "\", \"roles\": ["
              + Arrays.stream(roles).map(r -> String.format("\"%s\"", r.name())).collect(
          Collectors.joining(","))
              + "]}",
          MediaType.APPLICATION_JSON_UTF8));
}

private void mockSsoServer(DefaultResponseCreator responseCreator) {
  mockRestServiceServer.reset();
  mockRestServiceServer.expect(manyTimes(),
      MockRestRequestMatchers.requestTo(
          "http://mock-sso:8080/validate?token=" + AbstractIntegrationTestBase.BEARER_TOKEN))
      .andExpect(MockRestRequestMatchers.method(HttpMethod.GET))
      .andRespond(responseCreator);
}
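
In a concrete test class, these helpers can then be wired together in the test setup, for example as in the sketch below. The user id and the UserRole value are made up for illustration.

@Before
public void setUp() {
  // intercept the Auth API calls and let the mocked SSO return an admin user
  initMockRestServiceServer();
  mockAuthService("user-4711", UserRole.ADMIN);
}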

Setup of Database

Each integration test runs isolated in its own database schema. In order to allow automated setup as well as tear down of the test database, the integration test uses two different database users. The provisioning user is allowed to create database schemas as well as database users. The application database user is created by the provisioning user during the start of the Spring Boot application. Spring Boot provides an event mechanism which calls registered listeners during startup and shutdown. The ApplicationPreparedEvent is used to set up the database and to update the Spring datasource configuration with the application user.

The database setup is shown below. The SQL script for creating the database contains placeholders for the application user, whose name is simply built from the current system time in milliseconds.

@Component 
public class ApplicationEnvironmentPreparedListener implements ApplicationListener<ApplicationPreparedEvent> {

  private static final Logger LOGGER = LoggerFactory.getLogger(ApplicationEnvironmentPreparedListener.class);

  @Override
  public void onApplicationEvent(ApplicationPreparedEvent applicationPreparedEvent) {
    applicationPreparedEvent.getApplicationContext().registerShutdownHook();

    setupDatabase(applicationPreparedEvent);
  }

  private void setupDatabase(ApplicationPreparedEvent applicationPreparedEvent) {
    ConfigurableEnvironment environment = applicationPreparedEvent.getApplicationContext().getEnvironment();
    String provisioningDbUrl = environment.getProperty("spring.provisioning.datasource.url");
    String provisioningDbUser = environment.getProperty("spring.provisioning.datasource.username");
    String provisioningDbPassword = environment.getProperty("spring.provisioning.datasource.password");
    String appDatabaseUrl = environment.getProperty("spring.datasource.url");
    String dbUser = getDatabaseUser();
    // this event is called twice for some reason => avoid creating the schema twice
    if (!hasDatabaseBeenCreatedInBuild()) {
      dbUser = runSql(getDataSource(provisioningDbUser, provisioningDbPassword, provisioningDbUrl));
    }
    //add user to spring boot environment
    Properties props = new Properties();
    props.put("spring.datasource.username", dbUser);
    props.put("spring.datasource.url", appDatabaseUrl + dbUser);
    environment.getPropertySources().addFirst(new PropertiesPropertySource("integrationTestProperties", props));
    LOGGER.info("integration tests run on database schema {}", dbUser);
  }

  private String getDatabaseUser() {
    if (hasDatabaseBeenCreatedInBuild()) {
      return generatedDbSchemaName();
    } else {
      return "db_integration_" + String.valueOf(System.currentTimeMillis());
    }
  }

  private String runSql(DataSource datasource) {
    String generatedUser = getDatabaseUser();
    Resource resource = new ClassPathResource("databaseInit.sql");
    // the file needs to be executed as one script in order to allow the "use database" directives:
    // load the file, replace the placeholder and write it into a temp file for execution, which is
    // deleted on exit
    try {
      String sqlCommands = ScriptUtils.readScript(
          new LineNumberReader(new InputStreamReader(resource.getInputStream())), "--", ";");
      sqlCommands = sqlCommands.replaceAll("@userName@", generatedUser);
      File tempFile = File.createTempFile(generatedUser, ".sql");
      tempFile.deleteOnExit();
      try (FileWriter writer = new FileWriter(tempFile)) {
        writer.write(sqlCommands);
      }
      FileSystemResource tempResource = new FileSystemResource(tempFile);
      try (Connection connection = datasource.getConnection()) {
        ScriptUtils.executeSqlScript(connection, tempResource);
      }
    } catch (Exception e) {
      LOGGER.error("cannot initialize the database", e);
      throw new RuntimeException(e);
    }
    System.setProperty("generated.db", generatedUser);
    return generatedUser;
  }

  private DataSource getDataSource(String user, String password, String url) {
    DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
    dataSourceBuilder.url(url);
    dataSourceBuilder.username(user);
    dataSourceBuilder.password(password);
    return dataSourceBuilder.build();
  }

  private boolean hasDatabaseBeenCreatedInBuild() {
    return generatedDbSchemaName() != null;
  }

  private String generatedDbSchemaName() {
    return System.getProperty("generated.db");
  }
}

The databaseInit.sql script referenced above creates the database, the login and the application user; the @userName@ placeholder is replaced with the generated user name at runtime:
CREATE DATABASE [@userName@] CONTAINMENT = NONE;
ALTER DATABASE [@userName@] SET ANSI_NULL_DEFAULT OFF;
ALTER DATABASE [@userName@] SET ANSI_NULLS OFF;
ALTER DATABASE [@userName@] SET ANSI_PADDING OFF;
ALTER DATABASE [@userName@] SET ANSI_WARNINGS OFF;
ALTER DATABASE [@userName@] SET ARITHABORT OFF;
ALTER DATABASE [@userName@] SET AUTO_CLOSE OFF;
ALTER DATABASE [@userName@] SET AUTO_SHRINK OFF;
ALTER DATABASE [@userName@] SET AUTO_CREATE_STATISTICS ON(INCREMENTAL = OFF);
ALTER DATABASE [@userName@] SET AUTO_UPDATE_STATISTICS ON;
ALTER DATABASE [@userName@] SET CURSOR_CLOSE_ON_COMMIT OFF;
ALTER DATABASE [@userName@] SET CURSOR_DEFAULT GLOBAL;
ALTER DATABASE [@userName@] SET CONCAT_NULL_YIELDS_NULL OFF;
ALTER DATABASE [@userName@] SET NUMERIC_ROUNDABORT OFF;
ALTER DATABASE [@userName@] SET QUOTED_IDENTIFIER OFF;
ALTER DATABASE [@userName@] SET RECURSIVE_TRIGGERS OFF;
ALTER DATABASE [@userName@] SET DISABLE_BROKER;
ALTER DATABASE [@userName@] SET AUTO_UPDATE_STATISTICS_ASYNC OFF;
ALTER DATABASE [@userName@] SET DATE_CORRELATION_OPTIMIZATION OFF;
ALTER DATABASE [@userName@] SET PARAMETERIZATION SIMPLE;
ALTER DATABASE [@userName@] SET READ_COMMITTED_SNAPSHOT OFF;
ALTER DATABASE [@userName@] SET READ_WRITE;
ALTER DATABASE [@userName@] SET RECOVERY FULL;
ALTER DATABASE [@userName@] SET MULTI_USER;
ALTER DATABASE [@userName@] SET PAGE_VERIFY CHECKSUM;
ALTER DATABASE [@userName@] SET TARGET_RECOVERY_TIME = 60 SECONDS;
ALTER DATABASE [@userName@] SET DELAYED_DURABILITY = DISABLED;

USE [@userName@];
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 0;
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET MAXDOP = PRIMARY;
ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = OFF;
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET LEGACY_CARDINALITY_ESTIMATION = PRIMARY;
ALTER DATABASE SCOPED CONFIGURATION SET PARAMETER_SNIFFING = ON;
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET PARAMETER_SNIFFING = PRIMARY;
ALTER DATABASE SCOPED CONFIGURATION SET QUERY_OPTIMIZER_HOTFIXES = OFF;
ALTER DATABASE SCOPED CONFIGURATION FOR SECONDARY SET QUERY_OPTIMIZER_HOTFIXES = PRIMARY;

USE [@userName@];
IF NOT EXISTS (SELECT name FROM sys.filegroups WHERE is_default=1 AND name = N'PRIMARY') ALTER DATABASE [@userName@] MODIFY FILEGROUP [PRIMARY] DEFAULT;

CREATE LOGIN [@userName@] WITH PASSWORD=N'password', DEFAULT_DATABASE=[@userName@], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF;
CREATE USER [@userName@] FOR LOGIN [@userName@];
ALTER ROLE [db_owner] ADD MEMBER [@userName@];

Run Test

At this stage, the integration test setup provides an empty database. Database migrations that set up the DDL structure and seed data can be run using Flyway. In addition, entities required to test business logic can be inserted via JDBC templates using plain SQL commands, or via FactoryDuke.
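
As an illustration, seeding a test entity via the JdbcTemplate with a plain SQL command could look like the fragment below; the table and column names are invented and not part of the original application.

// plain SQL insert of a test entity via the JdbcTemplate (table and columns are invented)
@Autowired
private JdbcTemplate jdbcTemplate;

private void seedResource(String id, String name) {
  jdbcTemplate.update("INSERT INTO resource (id, name) VALUES (?, ?)", id, name);
}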

The test runs by simply calling the REST controller under test via the pre-configured rest template.

@Test
public void getResource_ok() throws Exception {
  final ResponseEntity<JsonNode> response = restTemplate.getForEntity("/resources/4711", JsonNode.class);
  assertEquals(HttpStatus.OK, response.getStatusCode());
  // ...
}

Tear Down Database

The ContextClosedEvent is used to drop the application user as well as the test database. This works similarly to the ApplicationPreparedEvent handling: the provisioning user is used to stop all running sessions of the application user, to drop the application user, and to drop the integration test database.
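
The tear down listener itself is not shown in the post; a minimal sketch, assuming SQL Server, Spring Boot 2 and a datasource built from the provisioning properties, could look like this:

import javax.sql.DataSource;

import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextClosedEvent;
import org.springframework.core.env.Environment;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class DatabaseTearDownListener implements ApplicationListener<ContextClosedEvent> {

  @Override
  public void onApplicationEvent(ContextClosedEvent event) {
    String dbUser = System.getProperty("generated.db");
    if (dbUser == null) {
      return; // nothing was created, nothing to tear down
    }
    Environment env = event.getApplicationContext().getEnvironment();
    // build a datasource for the provisioning user, analogous to getDataSource(...) above
    DataSource provisioning = DataSourceBuilder.create()
        .url(env.getProperty("spring.provisioning.datasource.url"))
        .username(env.getProperty("spring.provisioning.datasource.username"))
        .password(env.getProperty("spring.provisioning.datasource.password"))
        .build();
    JdbcTemplate jdbc = new JdbcTemplate(provisioning);
    // terminate open sessions on the test database, then drop the database and the login
    jdbc.execute("ALTER DATABASE [" + dbUser + "] SET SINGLE_USER WITH ROLLBACK IMMEDIATE");
    jdbc.execute("DROP DATABASE [" + dbUser + "]");
    jdbc.execute("DROP LOGIN [" + dbUser + "]");
  }
}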

Credits for cover image go to THE CURIOUS DEVELOPER

Thomas Körner

Thomas has programmed, conceived and designed software for 16 years. His path into IT took a classic route, from Basic on the C16 via Turbo Pascal on the 286 to C++ on several generations of Pentium processors. By now he has mastered Java as well. Thomas works for kreuzwerker in various…
