Thursday, January 4, 2018

Spring Based Application - Migrating to JUnit 5

This is a quick write-up on migrating a Gradle based Spring Boot app from JUnit 4 to the shiny new JUnit 5. JUnit 4 tests continue to work thanks to the JUnit 5 Test Engine abstraction, which provides support for tests written in different programming models; in this instance, JUnit 5 supplies a Vintage Test Engine with the ability to run JUnit 4 tests.


Here is a sample project with JUnit 5 integrations already in place along with sample tests in JUnit 4 and JUnit 5 - https://github.com/bijukunjummen/boot2-with-junit5-sample

Sample JUnit 4 candidate test

As a candidate project, I have a Spring Boot 2 app with tests written in Kotlin using JUnit 4 as the testing framework. This is how a sample test looks with all dependencies explicitly called out. It uses JUnit 4's @RunWith annotation to load up the Spring Context:

import org.assertj.core.api.Assertions.assertThat
import org.junit.Test
import org.junit.runner.RunWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest
import org.springframework.test.context.junit4.SpringRunner
import org.springframework.test.web.reactive.server.WebTestClient
import java.nio.charset.StandardCharsets

@RunWith(SpringRunner::class)
@WebFluxTest(controllers = arrayOf(RouteConfig::class))
class SampleJunit4Test {

    @Autowired
    lateinit var webTestClient: WebTestClient

    @Test
    fun `get of hello URI should return Hello World!`() {
        webTestClient.get()
                .uri("/hello")
                .exchange()
                .expectStatus().isOk
                .expectBody()
                .consumeWith({ m ->
                    assertThat(String(m.responseBodyContent, StandardCharsets.UTF_8)).isEqualTo("Hello World!")
                })

    }

}

The JUnit 4 dependencies are pulled in transitively via the "spring-boot-starter-test" module:

testCompile('org.springframework.boot:spring-boot-starter-test')


JUnit 5 migration


The first step is to pull in the JUnit 5 dependencies along with the Gradle plugin which enables running the tests:

Plugin:

buildscript {
 dependencies {
  ....
  classpath 'org.junit.platform:junit-platform-gradle-plugin:1.0.2'
 }
}
apply plugin: 'org.junit.platform.gradle.plugin'

Dependencies:

testCompile("org.junit.jupiter:junit-jupiter-api")
testRuntime("org.junit.jupiter:junit-jupiter-engine")
testRuntime("org.junit.vintage:junit-vintage-engine:4.12.2")

With these changes in place, all the JUnit 4 tests will continue to run both in the IDE and when the Gradle build is executed, and at this point the tests themselves can be slowly migrated over.

The test which I had shown before looks like this with JUnit 5 Jupiter, which provides the programming model for the tests:

import org.junit.jupiter.api.Assertions.assertEquals
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.extension.ExtendWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest
import org.springframework.test.context.junit.jupiter.SpringExtension
import org.springframework.test.web.reactive.server.WebTestClient
import java.nio.charset.StandardCharsets

@ExtendWith(SpringExtension::class)
@WebFluxTest(controllers = arrayOf(RouteConfig::class))
class SampleJunit5Test {

    @Autowired
    lateinit var webTestClient: WebTestClient

    @Test
    fun `get of hello URI should return Hello World!`() {
        webTestClient.get()
                .uri("/hello")
                .exchange()
                .expectStatus().isOk
                .expectBody()
                .consumeWith({ m ->
                    assertEquals("Hello World!", String(m.responseBodyContent, StandardCharsets.UTF_8))
                })
    }

}

Note that instead of using the JUnit 4 @RunWith annotation, I am now using the @ExtendWith annotation and providing SpringExtension as a parameter, which is responsible for loading up the Spring Context like before. The rest of the Spring annotations continue to work with JUnit 5. This way tests can be slowly moved over from JUnit 4 to JUnit 5.


Caveats

Not everything is smooth though; there are a few issues in migrating from JUnit 4 to JUnit 5, the biggest of them likely being the support for the JUnit @Rule and @ClassRule annotations. The JUnit 5 documentation does go into detail on how this can be mitigated - a sketch of one of the suggested routes follows.
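
As an illustrative sketch (not from the sample project above): the org.junit.jupiter:junit-jupiter-migrationsupport artifact provides the @EnableRuleMigrationSupport extension, which lets a limited set of JUnit 4 rules (those based on ExternalResource, such as TemporaryFolder, along with Verifier and ExpectedException) keep working inside a Jupiter test. Assuming that dependency is on the test classpath, a migrated test could look like this:

import org.assertj.core.api.Assertions.assertThat
import org.junit.Rule
import org.junit.jupiter.api.Test
import org.junit.jupiter.migrationsupport.rules.EnableRuleMigrationSupport
import org.junit.rules.TemporaryFolder

// A hypothetical test, purely to sketch the migration-support route - TemporaryFolder is a
// JUnit 4 rule based on ExternalResource, which the extension knows how to run.
@EnableRuleMigrationSupport
class SampleRuleMigrationTest {

    @Rule
    @JvmField
    val tempFolder = TemporaryFolder()

    @Test
    fun `temporary folder rule should still work`() {
        val file = tempFolder.newFile("sample.txt")
        assertThat(file).exists()
    }
}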

Monday, January 1, 2018

Kotlin - Tuple type

It is very simple to write a Tuple type with the expressiveness of Kotlin. My objective expressed in tests is the following:

1. Be able to define a Tuple of up to 5 elements and retrieve the elements using an index-like placeholder, in a test expressed with 2 elements like this:

val tup = Tuple("elem1", "elem2")
assertThat(tup._1).isEqualTo("elem1")
assertThat(tup._2).isEqualTo("elem2")

2. Be able to destructure the tuple into its constituent elements along the following lines:

val tup = Tuple("elem1", "elem2")
val (e1, e2) = tup

assertThat(e1).isEqualTo("elem1")
assertThat(e2).isEqualTo("elem2")


Implementation

The implementation for a tuple of 2 elements is the following, in its entirety:

data class Tuple2<out A, out B>(val _1: A, val _2: B)

A Kotlin data class provides all the underlying support for retrieving the individual fields and for destructuring using an expression like this:

val (e1, e2) = Tuple2("elem1", "elem2")

All that I need to do at this point is provide a helper that creates a tuple of the appropriate size based on the number of arguments provided, which I have defined as follows:

object Tuple {
    operator fun <A> invoke(_1: A): Tuple1<A> = Tuple1(_1)
    operator fun <A, B> invoke(_1: A, _2: B): Tuple2<A, B> = Tuple2(_1, _2)
    operator fun <A, B, C> invoke(_1: A, _2: B, _3: C): Tuple3<A, B, C> = Tuple3(_1, _2, _3)
    operator fun <A, B, C, D> invoke(_1: A, _2: B, _3: C, _4: D): Tuple4<A, B, C, D> = Tuple4(_1, _2, _3, _4)
    operator fun <A, B, C, D, E> invoke(_1: A, _2: B, _3: C, _4: D, _5: E): Tuple5<A, B, C, D, E> = Tuple5(_1, _2, _3, _4, _5)
}

which allows me to define tuples of different sizes using a construct which looks like this:

val tup2 = Tuple("elem1", "elem2")
val tup3 = Tuple("elem1", "elem2", "elem3")
val tup4 = Tuple("elem1", "elem2", "elem3", "elem4")

One more twist: typically a Pair type is an alias for a Tuple of 2 elements and a Triple is an alias for a Tuple of 3 elements. This can be trivially defined in Kotlin the following way:

typealias Pair<A, B> = Tuple2<A, B>
typealias Triple<A, B, C> = Tuple3<A, B, C>

Simple indeed! A more filled-in sample is available in my GitHub repo here - https://github.com/bijukunjummen/kfun

Tuesday, December 26, 2017

Kotlin - Try type for functional exception handling

Scala has a Try type to handle exceptions functionally. I got my head around this type using the excellent Neophyte's Guide to Scala by Daniel Westheide. This post will replicate this type using Kotlin.


Background


Consider a simple function which takes two Strings, converts them to integers and then divides them (sample based on the scaladoc of Try):

fun divide(dividend: String, divisor: String): Int {
    val num = dividend.toInt()
    val denom = divisor.toInt()
    return num / denom
}

It is the caller's responsibility to ensure that any exception propagated from this implementation is handled appropriately using the exception handling mechanism of Java/Kotlin:

try {
    divide("5t", "4")
} catch (e: ArithmeticException) {
    println("Got an exception $e")
} catch (e: NumberFormatException) {
    println("Got an exception $e")
}

My objective with the "Try" code will be to transform "divide" into something which looks like this:

fun divideFn(dividend: String, divisor: String): Try<Int> {
    val num = Try { dividend.toInt() }
    val denom = Try { divisor.toInt() }
    return num.flatMap { n -> denom.map { d -> n / d } }
}

A caller of this variant of the "divide" function will not have an exception to handle through a try/catch block; instead, it will get back the exception as a value which it can introspect and act on as needed.

val result = divideFn("5t", "4")
when(result) {
    is Success -> println("Got ${result.value}")
    is Failure -> println("An error : ${result.e}") 
}

Kotlin implementation

The "Try" type has two implementations corresponding to the "Success" path or a "Failure" path and implemented as a sealed class the following way:

sealed class Try<out T> {}

data class Success<out T>(val value: T) : Try<T>() {}

data class Failure<out T>(val e: Throwable) : Try<T>() {}

The "Success" type wraps around the successful result of an execution and "Failure" type wraps any exception thrown from the execution.

So now, to add some meat to these, my first test is to return one of these types based on whether the execution is clean or throws an exception, along these lines:

val trySuccessResult: Try<Int> = Try {
    4 / 2
}
assertThat(trySuccessResult.isSuccess()).isTrue()


val tryFailureResult: Try<Int> = Try {
    1 / 0
}
assertThat(tryFailureResult.isFailure()).isTrue()

This can be achieved through a "companion object" in Kotlin (similar to static methods in Java), which returns either a Success type or a Failure type based on the execution of the lambda expression:

sealed class Try<out T> {
    ...    
    companion object {
        operator fun <T> invoke(body: () -> T): Try<T> {
            return try {
                Success(body())
            } catch (e: Exception) {
                Failure(e)
            }
        }
    }
    ...
}

Now that a caller has a "Try" type, they can check whether it is a "Success" or a "Failure" using the "when" expression like before, or using the "isSuccess" and "isFailure" methods, which are delegated to the subtypes like this:

sealed class Try<out T> {
    abstract fun isSuccess(): Boolean
    abstract fun isFailure(): Boolean
}

data class Success<out T>(val value: T) : Try<T>() {
    override fun isSuccess(): Boolean = true
    override fun isFailure(): Boolean = false
}

data class Failure<out T>(val e: Throwable) : Try<T>() {
    override fun isSuccess(): Boolean = false
    override fun isFailure(): Boolean = true
}

In case of a Failure, a default can be returned to the caller, something like this in a test:

val t1 = Try { 1 }

assertThat(t1.getOrElse(100)).isEqualTo(1)

val t2 = Try { "something" }
        .map { it.toInt() }
        .getOrElse(100)

assertThat(t2).isEqualTo(100)

Again, this is implemented by delegating to the subtypes:

sealed class Try<out T> {
  abstract fun get(): T
  abstract fun getOrElse(default: @UnsafeVariance T): T
  abstract fun orElse(default: Try<@UnsafeVariance T>): Try<T>
}

data class Success<out T>(val value: T) : Try<T>() {
    override fun getOrElse(default: @UnsafeVariance T): T = value
    override fun get() = value
    override fun orElse(default: Try<@UnsafeVariance T>): Try<T> = this
}

data class Failure<out T>(val e: Throwable) : Try<T>() {
    override fun getOrElse(default: @UnsafeVariance T): T = default
    override fun get(): T = throw e
    override fun orElse(default: Try<@UnsafeVariance T>): Try<T> = default
}
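
The "orElse" variant, unlike "getOrElse", returns another "Try" and therefore keeps the result chainable. A quick usage sketch (not part of the original test suite):

// The first Try fails with a NumberFormatException, so the supplied fallback Try is returned
val recovered: Try<Int> = Try { "not a number".toInt() }
        .orElse(Try { 0 })

assertThat(recovered).isEqualTo(Success(0))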


The biggest advantage of returning a "Try" type, however, is in chaining further operations on the type.

Chaining with map and flatMap

"map" operation is passed a lambda expression to transform the value in some form - possibly even to a different type:

val t1 = Try { 2 }

val t2 = t1.map({ it * 2 }).map { it.toString()}

assertThat(t2).isEqualTo(Success("4"))

Here a number is being doubled and then converted to a string. If the initial Try were a "Failure", then the final result would simply be the "Failure", along the lines of this test:

val t1 = Try {
    2 / 0
}

val t2 = t1.map({ it * 2 }).map { it * it }

assertThat(t2).isEqualTo(Failure<Int>((t2 as Failure).e))


Implementing "map" is fairly straightforward:

sealed class Try<out T> {
    fun <U> map(f: (T) -> U): Try<U> {
        return when (this) {
            is Success -> Try {
                f(this.value)
            }
            is Failure -> this as Failure<U>
        }
    }
}


"flatMap", on the other hand, takes in a lambda expression which returns another "Try" type and flattens the result back into a "Try" type, along the lines of this test:

val t1 = Try { 2 }

val t2 = t1
        .flatMap { i -> Try { i * 2 } }
        .flatMap { i -> Try { i.toString() } }

assertThat(t2).isEqualTo(Success("4"))

Implementing this is simple too, along the following lines:

sealed class Try<out T> {
    fun <U> flatMap(f: (T) -> Try<U>): Try<U> {
        return when (this) {
            is Success -> f(this.value)
            is Failure -> this as Failure<U>
        }
    }
}

The "map" and "flatMap" methods are the power tools of this type, allowing chaining of complex operations together focusing on the happy path.


Conclusion

Try is a powerful type allowing functional handling of exceptions in the code. I have a strawman implementation using Kotlin available in my GitHub repo here - https://github.com/bijukunjummen/kfun

Monday, December 18, 2017

Kotlin - tail recursion optimization

The Kotlin compiler optimizes tail-recursive calls, with a few catches. Consider a rank function to search for the index of an element in a sorted array, implemented the following way using tail recursion, along with a test for it:

fun rank(k: Int, arr: Array<Int>): Int {
    tailrec fun rank(low: Int, high: Int): Int {
        if (low > high) {
            return -1
        }
        val mid = (low + high) / 2

        return when {
            (k < arr[mid]) -> rank(low, mid - 1)
            (k > arr[mid]) -> rank(mid + 1, high)
            else -> mid
        }
    }

    return rank(0, arr.size - 1)
}

@Test
fun rankTest() {
    val array = arrayOf(2, 4, 6, 9, 10, 11, 16, 17, 19, 20, 25)
    assertEquals(-1, rank(100, array))
    assertEquals(0, rank(2, array))
    assertEquals(2, rank(6, array))
    assertEquals(5, rank(11, array))
    assertEquals(10, rank(25, array))
}

IntelliJ provides an awesome feature to show the bytecode of any Kotlin code, along the lines of the following screenshot:



A Kotlin equivalent of the type of bytecode that the Kotlin compiler generates is the following:

fun rankIter(k: Int, arr: Array<Int>): Int {
    fun rankIter(low: Int, high: Int): Int {
        var lo = low
        var hi = high
        while (lo <= hi) {
            val mid = (lo + hi)/2

            if (k < arr[mid]) {
                hi = mid - 1
            } else if (k > arr[mid]){
                lo = mid + 1
            } else {
                return mid
            }

        }
        return -1
    }

    return rankIter(0, arr.size - 1)
}

The tail calls have been translated into a simple loop.


There are a few catches that I could see though:
1. The compiler has to be explicitly told which functions are tail-recursive using the "tailrec" modifier
2. Adding the "tailrec" modifier to a non-tail-recursive function does not generate a compiler error, though a warning does appear during the compilation step (see the sketch below)
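
A minimal sketch of the second catch (a hypothetical factorial function, not part of the sample above) - the recursive call here is not in a tail position, so the compiler only warns and falls back to plain recursion:

// The compiler warns that the function is marked tail-recursive but no tail calls are found
tailrec fun factorial(n: Int): Int {
    if (n <= 1) return 1
    return n * factorial(n - 1) // not a tail call - the multiplication happens after the call
}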

Sunday, December 10, 2017

Spring Webflux - Writing Filters

Spring Webflux is the new reactive web framework available as part of Spring 5+. The way filters were written in a traditional Spring MVC based application (Servlet Filter, HandlerInterceptor) is very different from the way a filter is written in a Spring Webflux based application, and this post will briefly go over the Webflux approach to filters.

Approach 1 - WebFilter

The first approach, using a WebFilter, affects all endpoints broadly and covers Webflux endpoints written in a functional style as well as endpoints written using an annotation style. A WebFilter in Kotlin looks like this:

    @Bean
    fun sampleWebFilter(): WebFilter {
        return WebFilter { e: ServerWebExchange, c: WebFilterChain ->
            val l: MutableList<String> = e.getAttributeOrDefault(KEY, mutableListOf())
            l.add("From WebFilter")
            e.attributes.put(KEY, l)
            c.filter(e)
        }
    }

The WebFilter adds a request attribute whose value is a collection, into which the filter simply puts a message indicating that it has intercepted the request.
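
Since the WebFilter also covers annotation-style endpoints, an annotated controller can read the same attribute off the ServerWebExchange. A small sketch - the controller and URI below are hypothetical, while the KEY constant and Greeting type are assumed from the sample project:

import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController
import org.springframework.web.server.ServerWebExchange

@RestController
class AnnotatedGreetingController {

    @GetMapping("/annotated/hello")
    fun hello(exchange: ServerWebExchange): Greeting {
        // Read the messages that the WebFilter (and any other filters) accumulated
        val messages: MutableList<String> = exchange.getAttributeOrDefault(KEY, mutableListOf())
        return Greeting("${messages.joinToString()}: Hello")
    }
}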

Approach 2 - HandlerFilterFunction


The second approach is more focused and covers only endpoints written using the functional style; here specific RouterFunctions can be hooked up with a filter.

Consider a Spring Webflux endpoint defined the following way:

@Bean
fun route(): RouterFunction<*> = router {
    GET("/react/hello", { r ->
        ok().body(fromObject(
                Greeting("${r.attribute(KEY).orElse("[Fallback]: ")}: Hello")
        ))
    })

    POST("/another/endpoint", TODO())

    PUT("/another/endpoint", TODO())
}

A HandlerFilterFunction which intercepts these APIs alone can be added in a highly focused way, along these lines:

fun route(): RouterFunction<*> = router {
    GET("/react/hello", { r ->
        ok().body(fromObject(
                Greeting("${r.attribute(KEY).orElse("[Fallback]: ")}: Hello")
        ))
    })
    
    POST("/another/endpoint", TODO())
    
    PUT("/another/endpoint", TODO())
    
}.filter({ r: ServerRequest, n: HandlerFunction<ServerResponse> ->
    val greetings: MutableList<String> = r.attribute(KEY)
            .map { v ->
                v as MutableList<String>
            }.orElse(mutableListOf())

    greetings.add("From HandlerFilterFunction")

    r.attributes().put(KEY, greetings)
    n.handle(r)
})

Note that there is no need to be explicit about the types in Kotlin; I have added them just to be clear about the types in some of the lambda expressions.


Conclusion

The WebFilter and HandlerFilterFunction approaches are very different from the Spring Web MVC based approach of writing filters using the Servlet spec or HandlerInterceptors, and this post summarizes the new approaches - I have samples available in my git repo which go over these in more detail.

Friday, December 1, 2017

Annotated controllers - Spring Web/Webflux and Testing

Spring Webflux and Spring Web are two entirely different web stacks. Spring Webflux, however, continues to support an annotation-based programming model.

An endpoint defined using these two stacks may look similar, but the way to test such an endpoint is fairly different, and a user writing such an endpoint has to be aware of which stack is active and formulate the test accordingly.

Sample Endpoint

Consider a sample annotation based endpoint:


import org.springframework.web.bind.annotation.PostMapping
import org.springframework.web.bind.annotation.RequestBody
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController


data class Greeting(val message: String)

@RestController
@RequestMapping("/web")
class GreetingController {
    
    @PostMapping("/greet")
    fun handleGreeting(@RequestBody greeting: Greeting): Greeting {
        return Greeting("Thanks: ${greeting.message}")
    }
    
}


Testing with Spring Web

If Spring Boot 2 starters were used to create this application with Spring Web as the starter, specified using a Gradle build file the following way:

compile('org.springframework.boot:spring-boot-starter-web')

then the test of such an endpoint would use a mock web runtime, referred to as MockMvc:

import org.junit.Test
import org.junit.runner.RunWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest
import org.springframework.http.MediaType
import org.springframework.test.context.junit4.SpringRunner
import org.springframework.test.web.servlet.MockMvc
import org.springframework.test.web.servlet.request.MockMvcRequestBuilders.post
import org.springframework.test.web.servlet.result.MockMvcResultMatchers.content


@RunWith(SpringRunner::class)
@WebMvcTest(GreetingController::class)
class GreetingControllerMockMvcTest {

    @Autowired
    lateinit var mockMvc: MockMvc

    @Test
    fun testHandleGreetings() {
        mockMvc
                .perform(
                        post("/web/greet")
                                .content(""" 
                                |{
                                |"message": "Hello Web"
                                |}
                            """.trimMargin())
                ).andExpect(content().json("""
                    |{
                    |"message": "Thanks: Hello Web"
                    |}
                """.trimMargin()))
    }
}


Testing with Spring Web-Flux

If, on the other hand, the Spring Webflux starter were pulled in, say with the following Gradle dependency:

compile('org.springframework.boot:spring-boot-starter-webflux')

then the test of this endpoint would use the excellent WebTestClient class, along these lines:

import org.junit.Test
import org.junit.runner.RunWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.reactive.WebFluxTest
import org.springframework.http.HttpHeaders
import org.springframework.test.context.junit4.SpringRunner
import org.springframework.test.web.reactive.server.WebTestClient
import org.springframework.web.reactive.function.BodyInserters


@RunWith(SpringRunner::class)
@WebFluxTest(GreetingController::class)
class GreetingControllerTest {

    @Autowired
    lateinit var webTestClient: WebTestClient

    @Test
    fun testHandleGreetings() {
        webTestClient.post()
                .uri("/web/greet")
                .header(HttpHeaders.CONTENT_TYPE, "application/json")
                .body(BodyInserters
                        .fromObject(""" 
                                |{
                                |   "message": "Hello Web"
                                |}
                            """.trimMargin()))
                .exchange()
                .expectStatus().isOk
                .expectBody()
                .json("""
                    |{
                    |   "message": "Thanks: Hello Web"
                    |}
                """.trimMargin())
    }
}


Conclusion

It is easy to assume that since the programming model looks very similar across the Spring Web and Spring Webflux stacks, the tests written for a legacy Spring Web endpoint would carry over to Spring Webflux. This is however not true; as developers we have to be mindful of the underlying stack that comes into play and formulate the test accordingly. I hope this post clarifies how such a test should be crafted.

Monday, November 20, 2017

Using Micrometer with Spring Boot 2

This is a very quick introduction to using the excellent Micrometer library to instrument a Spring Boot 2 based application and record the metrics in Prometheus.


Introduction

Micrometer provides a Java based facade over the client libraries that the different monitoring tools provide.

As an example, consider Prometheus: if I were to integrate my Java application with Prometheus directly, I would use the Prometheus Java client library and its data structures (Counter, Gauge etc.) to collect and provide data to Prometheus. If for any reason the monitoring system is changed, the code would have to be changed for the new system.
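
To make the contrast concrete, here is a rough sketch (the metric name and handler function are illustrative) of coding directly against the Prometheus Java client (the io.prometheus:simpleclient library) - the Counter type here is Prometheus specific:

import io.prometheus.client.Counter

// Prometheus-specific data structure, registered with the default CollectorRegistry
val requestsTotal: Counter = Counter.build()
        .name("requests_total")
        .help("Total number of requests handled.")
        .register()

fun handleRequest() {
    requestsTotal.inc()
    // ... handle the request ...
}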

Micrometer attempts to alleviate this by providing a common facade that applications code against; binding to the monitoring system is purely a runtime concern, and so changing the metrics system from Prometheus to, say, Datadog just requires changing a runtime library, without needing any code changes.
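
With Micrometer the equivalent code refers only to the facade types, and the Prometheus (or Datadog) binding is picked up at runtime. A rough sketch, assuming a MeterRegistry is injected by Spring Boot and with an illustrative class and metric name:

import io.micrometer.core.instrument.Counter
import io.micrometer.core.instrument.MeterRegistry

class RequestHandler(registry: MeterRegistry) {

    // Vendor-neutral counter - the backing implementation depends on the registry at runtime
    private val requestsTotal: Counter = registry.counter("requests.total")

    fun handleRequest() {
        requestsTotal.increment()
        // ... handle the request ...
    }
}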



Instrumenting a Spring Boot 2 Application

Nothing special needs to be done to get Micrometer support for a Spring Boot 2 based app; adding in the actuator starter pulls in Micrometer as a transitive dependency:

For example, in a Gradle based project this is sufficient:

dependencies {
    compile('org.springframework.boot:spring-boot-starter-actuator')
    ...
}

Additionally, since the intention is to send the data to Prometheus, a dependency has to be pulled in which provides the necessary Micrometer SPI implementation:


dependencies {
    ...
    runtime("io.micrometer:micrometer-registry-prometheus")
    ...
}

By default Micrometer provides a set of intelligent bindings which instrument the Spring Web and Webflux endpoints and add in meters to collect the duration and count of calls. Additionally, it provides bindings to collect JVM metrics - memory usage, thread pools, etc.

An application property needs to be enabled to expose an endpoint which Prometheus will use to scrape the metrics data:

endpoints:
  prometheus:
    enabled: true

If the application is brought up at this point, the "/application/prometheus" endpoint should be available showing a rich set of metrics; the following is a sample from my machine:


The default metrics are very rich and should cover the most common metrics requirements of an application; if additional metrics are required, they can easily be added in, as shown in the following code snippet:

class MessageHandler {
    
    private val counter = Metrics.counter("handler.calls", "uri", "/messages")
    
    fun handleMessage(req: ServerRequest): Mono<ServerResponse> {
        return req.bodyToMono<Message>().flatMap { m ->
            counter.increment()
            ...
...
}

Integrating with Prometheus

Prometheus can be configured to scrape data from the endpoint exposed by the Spring Boot 2 app; a snippet of Prometheus configuration looks like this:

scrape_configs:
  - job_name: 'myapp'
    metrics_path: /application/prometheus
    static_configs:
      - targets: ['localhost:8080']

This is not really a production configuration; in a production setting it may be better to use a Prometheus Push Gateway to broker the collection of metrics.

Prometheus provides a basic UI to preview the information that it scrapes; it can be accessed by default at port 9090. Here is a sample graph with the data produced during a load test:



Conclusion

Micrometer makes it very easy to instrument an application and collect a good set of basic metrics which can be stored and visualized in Prometheus. If you are interested in following this further, I have a sample application using Micrometer available here - https://github.com/bijukunjummen/boot2-load-demo