Monday, August 21, 2017

Gradle Kotlin DSL

Gradle build scripts can now be written using a DSL based on the Kotlin language. All the concepts that work with a traditional Gradle build translate to a very intuitive DSL in Kotlin, with two additional benefits - the script is typesafe and has excellent IDE support in IntelliJ IDEA.

My experience with the Gradle Kotlin DSL is fairly limited - all of one build script, which is the subject of this article.

If you simply want to see how a sample script looks, I have a github repo with just that here - https://github.com/bijukunjummen/cf-show-env


Just to compare:

1. Consider the way different plugins are applied in a traditional Gradle build:

plugins {
 id "com.github.pivotalservices.cf-app" version "1.0.9"
}

apply plugin: 'kotlin'
apply plugin: 'java'
apply plugin: 'org.springframework.boot'
apply from: 'gradle/gatling.gradle'


The equivalent Kotlin DSL is the following:

plugins {
    id("com.github.pivotalservices.cf-app").version("1.0.9")
}

apply {
    plugin("kotlin")
    plugin("java")
    plugin("org.springframework.boot")    
    from("gradle/gatling.gradle")
}



2. Adding project dependencies:

dependencies {
    compile('org.springframework.boot:spring-boot-starter-actuator')
    compile('org.springframework.boot:spring-boot-devtools')
    compile('org.springframework.boot:spring-boot-starter-thymeleaf')
    compile('org.springframework.boot:spring-boot-starter-web')
    compile('com.google.guava:guava:19.0')
    compile("org.webjars:bootstrap:3.3.7")
    compile("org.webjars:jquery:3.1.1")
    compile("io.prometheus:simpleclient:${prometheus_client_version}")
    compile("io.prometheus:simpleclient_spring_boot:${prometheus_client_version}")
    compile('nz.net.ultraq.thymeleaf:thymeleaf-layout-dialect')
    testCompile('org.springframework.boot:spring-boot-starter-test')
}

and the equivalent code using the Kotlin DSL:

dependencies {
    val prometheus_client_version = "0.0.21"

    compile("org.springframework.boot:spring-boot-starter-actuator")
    compile("org.springframework.boot:spring-boot-devtools")
    compile("org.springframework.boot:spring-boot-starter-thymeleaf")
    compile("org.springframework.boot:spring-boot-starter-web")
    compile("com.google.guava:guava:19.0")
    compile("org.webjars:bootstrap:3.3.7")
    compile("org.webjars:jquery:3.1.1")
    compile("io.prometheus:simpleclient:${prometheus_client_version}")
    compile("io.prometheus:simpleclient_spring_boot:${prometheus_client_version}")
    compile("nz.net.ultraq.thymeleaf:thymeleaf-layout-dialect")
    testCompile("org.springframework.boot:spring-boot-starter-test")
}

3. Configuring plugins - I have a plugin which helps deploy applications to Cloud Foundry; it works off a configuration which looks like this when expressed in a normal Gradle build:

cfConfig {
    //CF Details
    ccHost = "api.local.pcfdev.io"
    ccUser = "admin"
    ccPassword = "admin"
    org = "pcfdev-org"
    space = "pcfdev-space"

    //App Details
    name = "cf-show-env"
    hostName = "cf-show-env"
    filePath = "build/libs/cf-show-env-0.1.3-SNAPSHOT.jar"
    path = ""
    domain = "local.pcfdev.io"
    instances = 2
    memory = 1024
    timeout = 180

    //Env and services
    buildpack = "https://github.com/cloudfoundry/java-buildpack.git"


    environment = ["JAVA_OPTS": "-Djava.security.egd=file:/dev/./urandom", "SPRING_PROFILES_ACTIVE": "cloud"]

    cfService {
        name = "p-mysql"
        plan = "512mb"
        instanceName = "test-db"
    }
    
 
    
    cfUserProvidedService {
        instanceName = "mydb1"
        credentials = ["jdbcUri": "someuri1"]
    }
}

This can now be configured in a typesafe way, with full auto-completion support in IntelliJ IDEA, using the Kotlin DSL:

configure<CfPluginExtension> {
    //CF Details
    ccHost = "api.local.pcfdev.io"
    ccUser = "admin"
    ccPassword = "admin"
    org = "pcfdev-org"
    space = "pcfdev-space"

    //App Details
    name = "cf-show-env"
    hostName = "cf-show-env"
    filePath = "build/libs/cf-show-env-1.0.0-M1.jar"
    path = ""
    domain = "local.pcfdev.io"
    instances = 2
    memory = 1024
    timeout = 180

    //Env and services
    buildpack = "https://github.com/cloudfoundry/java-buildpack.git"

    environment = mapOf(
            "JAVA_OPTS" to "-Djava.security.egd=file:/dev/./urandom", 
            "SPRING_PROFILES_ACTIVE" to "cloud"
    )

    cfService(closureOf<CfService> {
        name = "p-mysql"
        plan = "512mb"
        instanceName = "test-db"
    })
    
    cfUserProvidedService(closureOf<CfUserProvidedService> { 
        instanceName = "myups"
        credentials = mapOf(
                "user" to "someuser",
                "uri" to "someuri"
        )
    })

}

4. And finally, a couple of plain tasks:

task "hello-world" {
    doLast {
        println("Hello World")
    }
}

task showAppUrls(dependsOn: "cf-get-app-detail") << {
    print "${project.cfConfig.applicationDetail}"
}

These look more or less the same in the Kotlin DSL:

task("hello-world") {
    doLast {
        println("Hello World")
    }
}


task("showAppUrls").dependsOn("cf-get-app-detail").doLast {
       println(cfConfig);
}
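
One practical detail worth calling out: Gradle picks up the Kotlin DSL from a script named build.gradle.kts rather than build.gradle, so the Kotlin snippets above live in a file along these lines:

// build.gradle.kts - the Kotlin DSL counterpart of build.gradle
plugins {
    id("com.github.pivotalservices.cf-app").version("1.0.9")
}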


I am excited about using the Kotlin DSL to configure my Gradle builds. There are a few quirks to keep in mind though - the IntelliJ support tends to be a little flaky, and it took a few tries for the IDE to start helping with auto-completion. I also needed to google quite a bit and look at some of the sample projects using the Gradle Kotlin DSL. All in all though, this has awesome potential.

Monday, August 14, 2017

Concourse caching for Java Maven and Gradle builds

Concourse CI 3.3.x has introduced the ability to cache paths between task runs. This feature helps speed up tasks which cache content in specific folders - here I will demonstrate how this feature can be used to speed up Maven and Gradle based Java builds.

The code and the pipeline that I am using for this post is available at my github repo here - https://github.com/bijukunjummen/ci-concourse-caching-sample

Let me start with the Gradle build. If I were to build the project using the Gradle wrapper with the following command:

./gradlew clean build

then Gradle would, by default, download the dependent libraries into a ".gradle" folder in the user's home folder. The location of this folder can be changed using the "GRADLE_USER_HOME" environment variable, which is what I will use in a Concourse task to control the location of a cached path.

A Concourse task which builds my project looks like this:

---
platform: linux
image_resource:
  type: docker-image
  source:
    repository: openjdk
    tag: 8-jdk
inputs:
  - name: repo
outputs:
  - name: out
run:
  path: /bin/bash
  args:
    - repo/ci/tasks/build.sh

caches:
  - path: .gradle/
  - path: .m2/

params:
  PROJECT_TYPE: 

Note that the caches parameter above specifies a ".gradle" path (and a ".m2" path for the Maven build). So all I have to do now is ensure that Gradle uses this location as its home folder, which I do in my build script:

export ROOT_FOLDER=$( pwd )
export GRADLE_USER_HOME="${ROOT_FOLDER}/.gradle"


The process to cache resources for a Maven build is along the same lines. Maven caches the dependent jars in a location that can be specified in a variety of ways; the one I have used is to specify this location via a dynamically generated settings.xml file, the following way:

M2_HOME=${HOME}/.m2
mkdir -p ${M2_HOME}

M2_LOCAL_REPO="${ROOT_FOLDER}/.m2"

mkdir -p "${M2_LOCAL_REPO}/repository"

cat > ${M2_HOME}/settings.xml <<EOF

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                          https://maven.apache.org/xsd/settings-1.0.0.xsd">
      <localRepository>${M2_LOCAL_REPO}/repository</localRepository>
</settings>

EOF

This is quite a bit of bash scripting, but all it does is generate a settings.xml with a localRepository tag pointing to a ".m2/repository" folder which is relative to the temporary folder created by Concourse for the build, and which can therefore be cached.

With these changes in place, the downloads happen on the first run of the task and are then cached for subsequent runs. In my local Concourse set-up, a Gradle build that takes about 2 minutes the first time takes about 20 seconds on a subsequent build!

You can try out this feature in my demo project here - https://github.com/bijukunjummen/ci-concourse-caching-sample



Friday, July 28, 2017

Kotlintest and property based testing

I was very happy to see that Kotlintest, a Kotlin port of the excellent scalatest, supports property based testing.

I was introduced to property based testing through the excellent "Functional programming in Scala" book.

The idea behind property based testing is simple - the behavior of a program is described as a property and the testing framework generates random data to validate the property. This is best illustrated with an example using the excellent scalacheck library:


import org.scalacheck.Prop.forAll
import org.scalacheck.Properties

object ListSpecification extends Properties("List") {
  property("reversing a list twice should return the list") = forAll { (a: List[Int]) =>
    a.reverse.reverse == a
  }
}

scalacheck would generate random lists (of integers) of varying sizes and validate that this property holds for them. A similar specification expressed through Kotlintest looks like this:

import io.kotlintest.properties.forAll
import io.kotlintest.specs.StringSpec


class ListSpecification : StringSpec({
    "reversing a list twice should return the list" {
        forAll{ list: List<Int> ->
            list.reversed().reversed().toList() == list
        }
    }
})

If the generators have to be a little more constrained - say we wanted to test this behavior on lists of integers in the range 1 to 1000 - then an explicit generator can be passed in the following way, again starting with scalacheck:

import org.scalacheck.Prop.forAll
import org.scalacheck.{Gen, Properties}

object ListSpecification extends Properties("List") {
  val intList = Gen.listOf(Gen.choose(1, 1000))
  property("reversing a list twice should return the list") = forAll(intList) { (a: List[Int]) =>
    a.reverse.reverse == a
  }
}

and the equivalent Kotlintest code:

import io.kotlintest.properties.Gen
import io.kotlintest.properties.forAll
import io.kotlintest.specs.StringSpec

class BehaviorOfListSpecs : StringSpec({
    "reversing a list twice should return the list" {
        val intList = Gen.list(Gen.choose(1, 1000))

        forAll(intList) { list ->
            list.reversed().reversed().toList() == list
        }
    }
})

Given this, let me now jump to another example from the scalacheck site, this time to illustrate a failure:

import org.scalacheck.Prop.forAll
import org.scalacheck.Properties

object StringSpecification extends Properties("String") {

  property("startsWith") = forAll { (a: String, b: String) =>
    (a + b).startsWith(a)
  }

  property("concatenate") = forAll { (a: String, b: String) =>
    (a + b).length > a.length && (a + b).length > b.length
  }

  property("substring") = forAll { (a: String, b: String, c: String) =>
    (a + b + c).substring(a.length, a.length + b.length) == b
  }
}

The second property described above is wrong - it claims that when two strings are concatenated, the result is ALWAYS longer than each of the parts, which is not true if one of the strings is blank (with a = "" and b = "", for example, the concatenated length is 0, which is not greater than either part's length). If I run this test using scalacheck, it correctly catches this wrongly specified behavior:

+ String.startsWith: OK, passed 100 tests.
! String.concatenate: Falsified after 0 passed tests.
> ARG_0: ""
> ARG_1: ""
+ String.substring: OK, passed 100 tests.
Found 1 failing properties.

The equivalent Kotlintest code is the following:

import io.kotlintest.properties.forAll
import io.kotlintest.specs.StringSpec

class StringSpecification : StringSpec({
    "startsWith" {
        forAll { a: String, b: String ->
            (a + b).startsWith(a)
        }
    }

    "concatenate" {
        forAll { a: String, b: String ->
            (a + b).length > a.length && (a + b).length > b.length
        }
    }

    "substring" {
        forAll { a: String, b: String, c: String ->
            (a + b + c).substring(a.length, a.length + b.length) == b
        }
    }
})

On running, it correctly catches the issue with "concatenate" and produces the following result:

java.lang.AssertionError: Property failed for

Y{_DZ<vGnzLQHf9|3$i|UE,;!%8^SRF;JX%EH+<5d:p`Y7dxAd;I+J5LB/:O)

 at io.kotlintest.properties.PropertyTestingKt.forAll(PropertyTesting.kt:27)

However, there is an issue here: scalacheck found a much simpler failing case. It does this through a process called "test case minimization", where on a failure it tries to find the smallest test case that still fails - something Kotlintest can learn from.


There are other areas where Kotlintest lags behind scalacheck, a big one being the ability to combine generators, as in this scalacheck sample:

case class Person(name: String, age: Int)

val genPerson = for {
  name <- Gen.alphaStr
  age <- Gen.choose(1, 50)
} yield Person(name, age)

genPerson.sample
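
Until Kotlintest grows an equivalent, one workaround is to compose generators by hand. Here is a rough sketch of that idea - it assumes that the Gen interface in this version of Kotlintest can be implemented with a single generate() method, and that Gen.string() exists alongside the Gen.choose() used earlier:

import io.kotlintest.properties.Gen

// Hypothetical data class mirroring the Scala example above
data class Person(val name: String, val age: Int)

// A hand-rolled combined generator built out of the primitive generators
fun genPerson(): Gen<Person> = object : Gen<Person> {
    override fun generate(): Person =
            Person(Gen.string().generate(), Gen.choose(1, 50).generate())
}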

All in all though, I have found the Kotlintest DSL and its support for property based testing to be a good start, and I look forward to seeing how this library evolves over time.

If you want to play with these samples a little more, it is available in my github repo here - https://github.com/bijukunjummen/kotlintest-scalacheck-sample

Friday, July 14, 2017

Cloud Foundry Application manifest using Kotlin DSL

I had a blast working with, and getting my head around, the excellent support for creating DSLs in the Kotlin language.
Kotlin DSLs are now being used for creating Gradle build files, for defining routes in Spring Webflux, and for creating HTML templates using the kotlinx.html library.

Here I am going to demonstrate creating a Kotlin based DSL to represent the content of a Cloud Foundry application manifest.

A sample manifest looks like this when represented as a yaml file:

applications:
 - name: myapp
   memory: 512M
   instances: 1
   path: target/someapp.jar
   routes:
     - somehost.com
     - another.com/path
   envs:
    ENV_NAME1: VALUE1
    ENV_NAME2: VALUE2

And here is the kind of DSL I am aiming for:

cf {
    name = "myapp"
    memory = 512(M)
    instances = 1
    path = "target/someapp.jar"
    routes {
        +"somehost.com"
        +"another.com/path"
    }
    envs {
        env["ENV_NAME1"] = "VALUE1"
        env["ENV_NAME2"] = "VALUE2"
    }
}


Getting the basic structure


Let me start with a simpler structure that looks like this:


cf {
    name = "myapp"
    instances = 1
    path = "target/someapp.jar"
}

and I want this kind of DSL to map to a data structure which looks like this:

data class CfManifest(
        var name: String = "",
        var instances: Int? = 0,
        var path: String? = null
)

It would translate to a Kotlin function which takes a Lambda expression:

fun cf(init: CfManifest.() -> Unit) {
 ...
}


The parameter which looks like this:
() -> Unit
is fairly self-explanatory - a lambda expression which takes no parameters and does not return anything.

The part that took a while to seep into my mind is this modified lambda expression, referred to as a Lambda expression with receiver:

CfManifest.() -> Unit

As I have understood it, it does two things:

1. It makes the lambda behave, within the scope of the wrapping function, like an extension function of the receiver type - in my case the CfManifest class
2. this within the lambda expression now refers to an instance of the receiver type
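
To make this concrete, here is a small standalone sketch (the function and names here are made up purely for illustration) of a lambda with receiver:

// A function that accepts a lambda with StringBuilder as its receiver -
// inside the block, "this" is the StringBuilder instance, so its members
// can be called without any qualifier.
fun makeString(block: StringBuilder.() -> Unit): String {
    val sb = StringBuilder()
    sb.block()   // invoke the lambda with sb as the receiver
    return sb.toString()
}

val greeting = makeString {
    append("Hello, ")   // resolves against the StringBuilder receiver
    append("DSL")
}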

Given this, the cf function translates to:

fun cf(init: CfManifest.() -> Unit): CfManifest {
    val manifest = CfManifest()
    manifest.init()
    return manifest
}

which can be succinctly expressed as:

fun cf(init: CfManifest.() -> Unit) = CfManifest().apply(init)

so now when I call:
cf {
    name = "myapp"
    instances = 1
    path = "target/someapp.jar"
}

It translates to:
CfManifest().apply {
  this.name = "myapp"
  this.instances = 1
  this.path = "target/someapp.jar"
}

More DSL

Expanding on the basic structure:

cf {
    name = "myapp"
    memory = 512(M)
    instances = 1
    path = "target/someapp.jar"
    routes {
        +"somehost.com"
        +"another.com/path"
    }
    envs {
        env["ENV_NAME1"] = "VALUE1"
        env["ENV_NAME2"] = "VALUE2"
    }
}

The routes and the envs in turn become methods on the CfManifest class and look like this:

data class CfManifest(
        var name: String = "",
        var path: String? = null,
        var memory: MEM? = null,
        ...
        var routes: ROUTES? = null,
        var envs: ENVS = ENVS()
) {

    fun envs(block: ENVS.() -> Unit) {
        this.envs = ENVS().apply(block)
    }

    ...

    fun routes(block: ROUTES.() -> Unit) {
        this.routes = ROUTES().apply(block)
    }
}

data class ENVS(
        var env: MutableMap<String, String> = mutableMapOf()
)

data class ROUTES(
        private val routes: MutableList<String> = mutableListOf()
) {
    operator fun String.unaryPlus() {
        routes.add(this)
    }
}

See how the routes method takes in a lambda expression with a receiver type of ROUTES; this allows me to define an expression like this:

cf {
    ...
    routes {
        +"somehost.com"
        +"another.com/path"
    }
    ...
}

Another trick here is the way a route is added:

+"somehost.com"

This is enabled by a Kotlin convention which translates specific method names into operators - here the unaryPlus method. The cool thing for me is that this operator is visible only in the scope of a ROUTES instance!


Another part of the DSL making use of Kotlin features is the way memory is specified; there are two parts to it - a number and a modifier, e.g. 2G, 500M.
This is expressed in a slightly modified way via the DSL as 2(G) and 500(M).

The way this is implemented uses another Kotlin convention: if a class has an invoke operator method, its instances can be called like a function, the following way:

class ClassWithInvoke() {
    operator fun invoke(n: Int): String = "" + n
}
val c = ClassWithInvoke()
c(10)

So implementing the invoke method as an extension function on Int, scoped to the CfManifest class, allows this kind of DSL:

data class CfManifest(
        var name: String = "",
        ...
) {
    ...
    operator fun Int.invoke(m: MemModifier): MEM = MEM(this, m)
}
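
The MEM and MemModifier types used above are not shown in the post; a minimal sketch of what they could look like (this is my assumption, kept deliberately simple) is:

// Hypothetical supporting types for the memory DSL - an enum of size units
// and a small value holder, so that 512(M) evaluates to MEM(512, MemModifier.M)
enum class MemModifier { M, G }

data class MEM(val amount: Int, val modifier: MemModifier)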


This is pure experimentation on my part. I am new to both Kotlin and Kotlin DSLs, so there are very likely a lot of things that can be improved in this implementation - any feedback and suggestions are welcome. You can play with this sample code at my github repo here.

Tuesday, June 27, 2017

Spring Webflux - Kotlin DSL

Spring Webflux has introduced a feature for defining functional application endpoints using a very intuitive Kotlin based DSL.

This post simply shows the contrast between an API defined using the Java based fluent API and one using the Kotlin based DSL.


A functional way to define a CRUD based Spring Webflux endpoint in Java would look like this:


RouterFunction<?> apis() {
    return nest(path("/messages"), nest(accept(MediaType.APPLICATION_JSON),
            route(
                    GET("/"), messageHandler::getMessages)
                    .andRoute(POST("/"), messageHandler::addMessage)
                    .andRoute(GET("/{id}"), messageHandler::getMessage)
                    .andRoute(PUT("/{id}"), messageHandler::updateMessage)
                    .andRoute(DELETE("/{id}"), messageHandler::deleteMessage)
    ));
}

The details of the endpoints are very clear and are defined in a fluent manner with just a few keywords - route, nest and the HTTP verbs.

These endpoints can be expressed using a Kotlin based DSL (and some clever use of Kotlin extension functions) the following way:

@Bean
fun apis() = router {
    (accept(APPLICATION_JSON) and "/messages").nest {
        GET("/", messageHandler::getMessages)
        POST("/", messageHandler::addMessage)
        GET("/{id}", messageHandler::getMessage)
        PUT("/{id}", messageHandler::updateMessage)
        DELETE("/{id}", messageHandler::deleteMessage)
    }
}

I feel that this reads a little better than the Java based fluent API. If the API is more complicated, as demonstrated in the excellent samples by Sébastien Deleuze with multiple levels of nesting, the Kotlin based DSL really starts to shine.
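
As a rough illustration of that nesting (the paths here are hypothetical and the same messageHandler as above is assumed), a more deeply nested router might read like this:

@Bean
fun nestedApis() = router {
    "/api".nest {
        accept(APPLICATION_JSON).nest {
            "/messages".nest {
                GET("/", messageHandler::getMessages)
                GET("/{id}", messageHandler::getMessage)
            }
        }
    }
}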


In the next post, I will delve into how this support has been implemented.

This sample is available in my github repo here

Sunday, June 11, 2017

Spring Boot Web Slice test - Sample

Spring Boot introduced test slicing a while back and it has taken me some time to get my head around it and explore some of its nuances.

Background


The main reason to use this feature is to reduce boilerplate. Consider a controller that looks like this, written in Kotlin just for variety:

@RestController
@RequestMapping("/users")
class UserController(
        private val userRepository: UserRepository,
        private val userResourceAssembler: UserResourceAssembler) {

    @GetMapping
    fun getUsers(pageable: Pageable, 
                 pagedResourcesAssembler: PagedResourcesAssembler<User>): PagedResources<Resource<User>> {
        val users = userRepository.findAll(pageable)
        return pagedResourcesAssembler.toResource(users, this.userResourceAssembler)
    }

    @GetMapping("/{id}")
    fun getUser(id: Long): Resource<User> {
        return Resource(userRepository.findOne(id))
    }
}


A traditional Spring Mock MVC test to test this controller would be along these lines:

@RunWith(SpringRunner::class)
@WebAppConfiguration
@ContextConfiguration
class UserControllerTests {

    lateinit var mockMvc: MockMvc

    @Autowired
    private val wac: WebApplicationContext? = null

    @Before
    fun setup() {
        this.mockMvc = MockMvcBuilders.webAppContextSetup(this.wac).build()
    }

    @Test
    fun testGetUsers() {
        this.mockMvc.perform(get("/users")
                .accept(MediaType.APPLICATION_JSON))
                .andDo(print())
                .andExpect(status().isOk)
    }

    @EnableSpringDataWebSupport
    @EnableWebMvc
    @Configuration
    class SpringConfig {

        @Bean
        fun userController(): UserController {
            return UserController(userRepository(), UserResourceAssembler())
        }

        @Bean
        fun userRepository(): UserRepository {
            val userRepository = Mockito.mock(UserRepository::class.java)
            given(userRepository.findAll(Matchers.any(Pageable::class.java)))
                    .willAnswer({ invocation ->
                        val pageable = invocation.arguments[0] as Pageable
                        PageImpl(
                                listOf(
                                        User(id = 1, fullName = "one", password = "one", email = "one@one.com"),
                                        User(id = 2, fullName = "two", password = "two", email = "two@two.com"))
                                , pageable, 10)
                    })
            return userRepository
        }
    }
}

There is a lot of ceremony involved in setting up such a test - a web application context which understands a web environment is pulled in, a configuration which sets up the Spring MVC environment needs to be created, and MockMvc, which is the handle to the testing framework, needs to be set up before each test.


Web Slice Test

A web slice test, compared to the previous test, is far simpler - it focuses on testing the controller and hides a lot of the boilerplate code:

@RunWith(SpringRunner::class)
@WebMvcTest(UserController::class)
class UserControllerSliceTests {

    @Autowired
    lateinit var mockMvc: MockMvc

    @MockBean
    lateinit var userRepository: UserRepository

    @SpyBean
    lateinit var userResourceAssembler: UserResourceAssembler

    @Test
    fun testGetUsers() {

        this.mockMvc.perform(get("/users").param("page", "0").param("size", "1")
                .accept(MediaType.APPLICATION_JSON))
                .andDo(print())
                .andExpect(status().isOk)
    }

    @Before
    fun setUp(): Unit {
        given(userRepository.findAll(Matchers.any(Pageable::class.java)))
                .willAnswer({ invocation ->
                    val pageable = invocation.arguments[0] as Pageable
                    PageImpl(
                            listOf(
                                    User(id = 1, fullName = "one", password = "one", email = "one@one.com"),
                                    User(id = 2, fullName = "two", password = "two", email = "two@two.com"))
                            , pageable, 10)
                })
    }
}

It works by creating a Spring application context, filtering out anything that is not relevant to the web layer, and loading only the controller which has been passed into the @WebMvcTest annotation. Any dependency that the controller requires can be injected in as a mock.


Coming to some of the nuances: say I wanted to provide one of the beans myself, the way to do it is to have the test use a custom Spring configuration - for a test this is done using an inner static class annotated with @TestConfiguration, the following way:

@RunWith(SpringRunner::class)
@WebMvcTest(UserController::class)
class UserControllerSliceTests {

    @Autowired
    lateinit var mockMvc: MockMvc

    @Autowired
    lateinit var userRepository: UserRepository

    @Autowired
    lateinit var userResourceAssembler: UserResourceAssembler

    @Test
    fun testGetUsers() {

        this.mockMvc.perform(get("/users").param("page", "0").param("size", "1")
                .accept(MediaType.APPLICATION_JSON))
                .andDo(print())
                .andExpect(status().isOk)
    }

    @Before
    fun setUp(): Unit {
        given(userRepository.findAll(Matchers.any(Pageable::class.java)))
                .willAnswer({ invocation ->
                    val pageable = invocation.arguments[0] as Pageable
                    PageImpl(
                            listOf(
                                    User(id = 1, fullName = "one", password = "one", email = "one@one.com"),
                                    User(id = 2, fullName = "two", password = "two", email = "two@two.com"))
                            , pageable, 10)
                })
    }

    @TestConfiguration
    class SpringConfig {

        @Bean
        fun userResourceAssembler(): UserResourceAssembler {
            return UserResourceAssembler()
        }

        @Bean
        fun userRepository(): UserRepository {
            return mock(UserRepository::class.java)
        }
    }

}


The beans from the "@TestConfiguration" add to the configuration which the slice tests depend on and do not completely replace it.

On the other hand, if I want to override the loading of the main "@SpringBootApplication" annotated class, I can pass in a Spring configuration class explicitly. The catch is that I now have to take care of loading the relevant Spring Boot features myself (enabling auto-configuration, appropriate scanning etc), so a way around it is to explicitly annotate the configuration as a Spring Boot application, the following way:

@RunWith(SpringRunner::class)
@WebMvcTest(UserController::class)
class UserControllerExplicitConfigTests {

    @Autowired
    lateinit var mockMvc: MockMvc

    @Autowired
    lateinit var userRepository: UserRepository

    @Test
    fun testGetUsers() {

        this.mockMvc.perform(get("/users").param("page", "0").param("size", "1")
                .accept(MediaType.APPLICATION_JSON))
                .andDo(print())
                .andExpect(status().isOk)
    }

    @Before
    fun setUp(): Unit {
        given(userRepository.findAll(Matchers.any(Pageable::class.java)))
                .willAnswer({ invocation ->
                    val pageable = invocation.arguments[0] as Pageable
                    PageImpl(
                            listOf(
                                    User(id = 1, fullName = "one", password = "one", email = "one@one.com"),
                                    User(id = 2, fullName = "two", password = "two", email = "two@two.com"))
                            , pageable, 10)
                })
    }

    @SpringBootApplication(scanBasePackageClasses = arrayOf(UserController::class))
    @EnableSpringDataWebSupport
    class SpringConfig {

        @Bean
        fun userResourceAssembler(): UserResourceAssembler {
            return UserResourceAssembler()
        }

        @Bean
        fun userRepository(): UserRepository {
            return mock(UserRepository::class.java)
        }
    }

}


The catch though is that other tests may now end up finding this inner configuration, which is far from ideal! So my learning has been to depend on the bare minimum slice test, and if needed extend it using @TestConfiguration.


I have a more detailed code sample available at my github repo, which has working examples to play with.

Tuesday, May 30, 2017

Ratio based routing to a legacy and a modern app - Netflix Zuul via Spring Cloud

A very common requirement when migrating from a legacy version of an application to a modernized version is to be able to move users over to the new application slowly. In this post I will be going over this kind of routing layer, written using the support for Netflix Zuul in Spring Cloud. Before I go ahead, I have to acknowledge that most of the code demonstrated here was written in collaboration with the superlative Shaozhen Ding.


Scenario

I have a legacy service which has been re-engineered into a more modern version (the assumption is that as part of this migration the URIs of the endpoints have not changed). I want to migrate users slowly from the legacy application over to the modern version.


Implementation using Spring Cloud Netflix - Zuul Support


This can be easily implemented using the Netflix Zuul support in the Spring Cloud project.

Zuul is driven by a set of filters which act before (pre filters), during (route filters) and after (post filters) a request to a backend. Spring Cloud adds its own set of filters to Zuul and drives the behavior of these filters through configuration that looks like this:

zuul:
  routes:
    ratio-route:
      path: /routes/**
      strip-prefix: false

This specifies that Zuul will handle requests to URIs with the prefix "/routes", and that this prefix will not be stripped from the downstream call. This logic is encoded in a "PreDecorationFilter". My objective is to act on the request AFTER the PreDecorationFilter and set the backend to either the legacy version or the modern version. Given this, a filter which acts on the request looks like this:

import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;
...

@Service
public class RatioBasedRoutingZuulFilter extends ZuulFilter {

    public static final String LEGACY_APP = "legacy";
    public static final String MODERN_APP = "modern";
    
    private Random random = new Random();
    
    @Autowired
    private RatioRoutingProperties ratioRoutingProperties;

    @Override
    public String filterType() {
        return "pre";
    }

    @Override
    public int filterOrder() {
        return FilterConstants.PRE_DECORATION_FILTER_ORDER + 1;
    }

    @Override
    public boolean shouldFilter() {
        RequestContext ctx = RequestContext.getCurrentContext();
        return ctx.containsKey(SERVICE_ID_KEY)
                && ctx.get(SERVICE_ID_KEY).equals("ratio-route");
    }

    @Override
    public Object run() {
        RequestContext ctx = RequestContext.getCurrentContext();

        if (isTargetedToLegacy()) {
            ctx.put(SERVICE_ID_KEY, LEGACY_APP);
        } else {
            ctx.put(SERVICE_ID_KEY, MODERN_APP);
        }
        return null;
    }

    boolean isTargetedToLegacy() {
        return random.nextInt(100) < ratioRoutingProperties.getOldPercent();
    }
}
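
The RatioRoutingProperties referenced in the filter is not shown above; a minimal sketch of what such a properties holder could look like (written here in Kotlin, with the prefix and property name being my assumptions) is:

import org.springframework.boot.context.properties.ConfigurationProperties
import org.springframework.stereotype.Component

// Hypothetical configuration holder for the routing ratio - oldPercent is the
// percentage of requests that should continue to be routed to the legacy app
@Component
@ConfigurationProperties(prefix = "ratio.routing")
class RatioRoutingProperties {
    var oldPercent: Int = 0
}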

The filter is set to act after the "PreDecorationFilter" by overriding the filterOrder() method. The routing logic is fairly naive but should work for most cases. The serviceId being set in the Zuul context has a value of "legacy" or "modern" and represents a "named" Ribbon client - a handle through which the details of the backend can be set. With Spring Cloud, my named clients are mapped to the legacy and modern versions of the app the following way:


legacy:
  ribbon:
    listOfServers: http://localhost:8081

modern:
  ribbon:
    DeploymentContextBasedVipAddresses: modern-app

Here, just for a little more variation, I am making a direct call to an endpoint for the legacy app and going through Eureka for the modern version of the application.


If you are interested in exploring the entirety of the application, it is available in my github repo.


With the entire set-up in place, a small test with the legacy version handling 20% of the traffic confirms that the filter routes the expected proportion of requests to each backend.
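
As a back-of-the-envelope way to convince yourself of the ratio logic, here is a small standalone simulation (not the actual test from the project) of the same check the filter performs:

import java.util.Random

// Simulates the filter's decision - random.nextInt(100) < oldPercent - over many
// requests and prints the fraction that would be routed to the legacy app
fun main() {
    val random = Random()
    val oldPercent = 20
    val total = 100_000
    val legacyCount = (1..total).count { random.nextInt(100) < oldPercent }
    println("legacy share = ${legacyCount.toDouble() / total}") // expect roughly 0.20
}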

Conclusion

Spring Cloud support for Netflix Zuul makes handling such routing scenarios a cinch, and it should be a good fit for any organization that needs to implement these kinds of routing requirements.