Modularizing with Micronaut Framework

03.05.2019
Roman Tuchin

In this blog post, I’m going to explain how you can make use of the Micronaut framework to organize your project code in such a way that you can externalize and modularize some of your beans. These beans can serve as common components shared between several projects. The Micronaut framework is a relatively new kid on the block and tries to compete with big players like Spring. So let’s take a look at what it brings to the table.

So let’s say you’re given the task of developing a file transformer, which can apply certain transformations to input files and put the result into an output folder. A transformation can be, for example, converting files from one format to another, like CSV to Parquet, or merging files into bigger/smaller ones. The transformer should be freely extensible, including by external implementations.

In order to pave the way for extensibility, let’s prepare a base project and one sample project with a concrete implementation.

Base Project

The whole tool will be prepared in the form of a command-line interface (CLI). I used the mn tool from Micronaut (see the Micronaut documentation for how to install it) to initialize our CLI project.

$ mn create-cli-app de.kreuzwerker.transformer-cli
| Generating Java project...
| Application created at $(HOME)/externalize-beans-micronaut/transformer-cli

Now we’ve got a skeleton CLI project which can be run either from your favorite IDE or from a terminal. Next, we’ll prepare the base code structure. This is the base Transformer class:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public abstract class Transformer {
   private final Logger logger = LoggerFactory.getLogger(this.getClass());

   public void startTransformation(List<Path> filesToProcess) {
       logger.info("Start transforming using the transformer {}.", this.getClass().getSimpleName());
       Path outputDirectory = outputDirectory();

       try {
           if (!Files.exists(outputDirectory)) {
               logger.info("Output directory doesn't exist. Creating it...");
               Files.createDirectories(outputDirectory);
           }

           transform(filesToProcess);
       } catch (IOException e) {
           logger.warn("IO problem during transforming.", e);
       }
       logger.info("Transforming finished.");
   }

   protected void transform(List<Path> filesToProcess) throws IOException {
       for (Path inputFilePath : filesToProcess) {
           logger.info("Processing input file: {}", inputFilePath);
           transform(inputFilePath);
           logger.info("Finished processing input file: {}", inputFilePath);
       }
   }

   protected abstract void transform(Path inputFile) throws IOException;

   protected abstract Path outputDirectory();
}

The class defines a few abstract methods, which will be implemented by a concrete extension. In this case, a concrete implementation has to define how to transform a file, via transform(Path inputFile), and where to put the result, via outputDirectory().

You will use this class in the TransformerCliCommand by injecting a list of Transformers through the constructor. In order to enable dependency injection, you annotate the TransformerCliCommand with @Singleton.

import java.util.Collections;
import java.util.List;

import javax.inject.Singleton;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import io.micronaut.configuration.picocli.PicocliRunner;
import picocli.CommandLine.Command;

@Command(name = "transformer-cli", description = "...",
       mixinStandardHelpOptions = true)
@Singleton
public class TransformerCliCommand implements Runnable {
   private final Logger logger = LoggerFactory.getLogger(this.getClass());

   private final List<Transformer> transformers;

   public TransformerCliCommand(List<Transformer> transformers) {
       this.transformers = transformers;
   }

   public static void main(String[] args) throws Exception {
       PicocliRunner.run(TransformerCliCommand.class, args);
   }

   public void run() {
       transformers
          .forEach(transformer -> transformer.startTransformation(Collections.emptyList()));
   }
}

After these steps, we’re able to run the base project and see that no transformers are applied.
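
For example, assuming the default Gradle build generated by mn (which produces a shaded *-all.jar), you might build and run it like this:

$ ./gradlew build
$ java -jar build/libs/transformer-cli-0.1-all.jar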

Our First Transformer

Now you will implement a first concrete transformer. To that end, you create an empty Gradle Java project, e.g. using gradle init --type java-application. Its build.gradle should look like this:

plugins {
   id 'java'
   id "io.spring.dependency-management" version "1.0.6.RELEASE"
}

group 'de.kreuzwerker.transformer'
version '1.0-SNAPSHOT'

sourceCompatibility = 11

repositories {
   mavenCentral()
}

dependencyManagement {
   imports {
       mavenBom 'io.micronaut:micronaut-bom:1.0.3'
   }
}

dependencies {
   annotationProcessor "io.micronaut:micronaut-inject-java"
   compile files('libs/transformer-cli-0.1.jar')
   compile "io.micronaut:micronaut-inject"
}

First, we add the compile "io.micronaut:micronaut-inject" dependency to be able to use the @Singleton annotation for dependency injection in the transformer implementation. You also add annotationProcessor "io.micronaut:micronaut-inject-java", which Micronaut needs in order to generate helper classes for dependency injection at compile time. The last step is importing the jar containing the base class Transformer. You build the base project with ./gradlew build and copy the resulting artifact jar into the libs directory of the second project. As you can see in the build.gradle above, the artifact is imported through compile files('libs/transformer-cli-0.1.jar'). Of course, you could also upload the built artifact to a Maven repository and import it the usual way.
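
If you go the Maven-repository route, a minimal sketch could look like this in the base project's build.gradle. The maven-publish setup below is an assumption on my part (it is not part of the generated build), so adjust it to your project:

// hypothetical addition to the base project's build.gradle:
// publish the artifact to the local Maven repository instead of copying it by hand
apply plugin: 'maven-publish'

publishing {
   publications {
       // the coordinates are taken from the project's group, name and version
       mavenJava(MavenPublication) {
           from components.java
       }
   }
}

After running ./gradlew publishToMavenLocal, the second project can add mavenLocal() to its repositories block and reference the artifact by its coordinates instead of a file path.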

After these preparations, let’s create a basic implementation of the Transformer.

import java.io.IOException;
import java.nio.file.Path;

import javax.inject.Singleton;

@Singleton
public class ParquetTransformer extends Transformer {
   @Override
   protected void transform(Path inputFile) throws IOException {
       //you do your transformation here
   }

   @Override
   protected Path outputDirectory() {
       return Path.of("/tmp/output-parquet");
   }
}

Putting Things Together

The parquet-transformer needs to be built and imported into our base project. You can do that, as described previously, either via a Maven repository or with a local copy of the artifact.
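
For the local-copy variant, the base project's build.gradle gets a file dependency analogous to the one shown above. The jar name below is hypothetical; it depends on the name and version you give your implementation project:

dependencies {
   // hypothetical artifact name produced by the parquet-transformer build
   compile files('libs/parquet-transformer-1.0-SNAPSHOT.jar')
}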

Once you have imported it, you can start your base project and see the results of transformations.

$ java -jar build/libs/transformer-cli-0.1-all.jar
14:50:49.710 [main] INFO  i.m.context.env.DefaultEnvironment - 
Established active environments: [cli]
14:50:50.357 [main] INFO  d.k.transformer.ParquetTransformer - 
Start transforming using the transformer ParquetTransformer.
14:50:50.360 [main] INFO  d.k.transformer.ParquetTransformer - 
Transforming finished.

Conclusion

As you’ve seen, you can externalize custom beans using Micronaut with little effort. In general, the Micronaut framework makes a very good impression. It’s more lightweight and bootstraps faster than Spring Boot, and it has some killer features. One of them: dependency injection is resolved at compile time, as opposed to the Spring Framework, where it happens at runtime.

Keep it up, Micronaut team!