June 7, 2017

Ratpacked: Assert No Exceptions Are Thrown With RequestFixture

Writing unit tests for our handlers in Ratpack is easy with RequestFixture. We invoke the handle method and pass the Handler or Chain we want to test as argument. With a second argument we can provide extra details on the fixture instance, for example adding objects to the registry or setting the request method. The handle method returns a HandlingResult object. This object has the method exception that we can use to see if an exception occurred in our code under test. The method throws a HandlerExceptionNotThrownException if the expected exception doesn't occur.
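
The second argument is an Action that configures the RequestFixture before the handler is invoked. As a minimal sketch (someHandler is just a placeholder for the handler under test) we could for example set the request method and add an object to the registry:

def result = RequestFixture.handle(someHandler) { fixture ->
    fixture.method 'POST'                    // Set the HTTP request method.
    fixture.registry.add String, 'greeting'  // Add an object to the fixture registry.
}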

In the following example we have two feature methods to check if an exception occurred or not:

package sample

import ratpack.func.Action
import ratpack.handling.Context
import ratpack.handling.Handler
import ratpack.test.handling.HandlerExceptionNotThrownException
import ratpack.test.handling.RequestFixture
import spock.lang.Specification

class HandlerSpec extends Specification {

    def 'check exception is thrown'() {
        given:
        def result = RequestFixture.handle new SampleHandler(true), Action.noop()

        expect:
        result.exception(Exception).message == 'Sample exception'
    }

    def 'check no exception is thrown'() {
        given:
        def result = RequestFixture.handle new SampleHandler(false), Action.noop()
        
        when:
        result.exception(Exception)

        then:
        thrown(HandlerExceptionNotThrownException)
    }
    
}

class SampleHandler implements Handler {
    
    /**
     * Indicate if we need to create an 
     * error with an exception or not.
     */
    private final boolean throwException

    SampleHandler(final boolean throwException) {
        this.throwException = throwException
    }

    @Override
    void handle(final Context ctx) throws Exception {
        if (throwException) {
            // Throw a sample exception.
            ctx.error(new Exception('Sample exception'))
            ctx.response.send()
        } else {
            // No exceptions.
            ctx.response.send('OK')
        }
    }
    
}

Instead of using the exception method of HandlingResult we can add a custom ServerErrorHandler to the fixture registry. Exceptions are then handled by this error handler, so we can ask the error handler whether an exception occurred or not. In the following code we use a custom error handler:

package sample

import ratpack.error.ServerErrorHandler
import ratpack.handling.Context
import ratpack.handling.Handler
import ratpack.test.handling.RequestFixture
import spock.lang.Specification

class HandlerSpec extends Specification {

    /**
     * Error handler to capture exceptions.
     */
    private specErrorHandler = new SpecErrorHandler()

    /**
     * Add error handler as {@link ServerErrorHandler}
     * implementation to the fixture registry.
     */
    private fixtureErrorHandler = { fixture ->
        fixture.registry.add ServerErrorHandler, specErrorHandler
    }
    
    def 'check exception is thrown'() {
        when:
        RequestFixture.handle new SampleHandler(true), fixtureErrorHandler

        then:
        specErrorHandler.exceptionThrown()
        
        and:
        specErrorHandler.throwable.message == 'Sample exception'
    }

    def 'check no exception is thrown'() {
        when:
        RequestFixture.handle new SampleHandler(false), fixtureErrorHandler

        then:
        specErrorHandler.noExceptionThrown()
    }
    
}

class SampleHandler implements Handler {
    
    /**
     * Indicate if we need to create an 
     * error with an exception or not.
     */
    private final boolean throwException

    SampleHandler(final boolean throwException) {
        this.throwException = throwException
    }

    @Override
    void handle(final Context ctx) throws Exception {
        if (throwException) {
            // Throw a sample exception.
            ctx.error(new Exception('Sample exception'))
            ctx.response.send()
        } else {
            // No exceptions.
            ctx.response.send('OK')
        }
    }
    
}

/**
 * Simple implementation for {@link ServerErrorHandler}
 * where we simply store the original exception and 
 * add utility methods to determine if an exception is
 * thrown or not.
 */
class SpecErrorHandler implements ServerErrorHandler {
    
    /**
     * Store original exception.
     */
    private Throwable throwable

    /**
     * Store exception in {@link #throwable} and 
     * set response status to {@code 500}.
     * 
     * @param context Context for request.
     * @param throwable Exception thrown in code.
     * @throws Exception Something goes wrong.
     */
    @Override
    void error(final Context context, final Throwable throwable) throws Exception {
        this.throwable = throwable
        context.response.status(500)
    }

    /**
     * @return {@code true} if error handler is invoked, {@code false} otherwise.
     */
    boolean exceptionThrown() {
        throwable != null
    }

    /**
     * @return {@code true} if error handler is not invoked, {@code false} otherwise.
     */
    boolean noExceptionThrown() {
        !exceptionThrown()
    }
    
}

Written with Ratpack 1.4.5.

June 2, 2017

Spocklight: Indicate Specification As Pending Feature

Sometimes we are working on a new feature in our code and we want to write a specification for it before the feature is really implemented. To indicate we know the specification will fail while the feature is not finished yet, we can add the @PendingFeature annotation to our specification method. With this annotation Spock will still execute the test, but will set the status to ignored if the test fails. But if the test passes, the status is set to failed. So once we have finished the feature, Spock kindly reminds us this way to remove the annotation.

In the following example specification we use the @PendingFeature annotation:

package sample

import spock.lang.Specification
import spock.lang.PendingFeature
import spock.lang.Subject

class SampleSpec extends Specification {

    @Subject
    private final converter = new Converter()

    @PendingFeature
    void 'new feature to make String upper case'() {
        given:
        def value = 'Spock is awesome'

        expect: // This will fail as expected
        converter.upper(value) == 'SPOCK IS AWESOME'
    }

}

class Converter {
    String upper(String value) {
        value
    }
}

When we run our test, for example with Gradle, the assertion fails as expected, but because of the @PendingFeature annotation the test is reported as ignored instead of failed.

Now let's implement the upper method:

package sample

import spock.lang.Specification
import spock.lang.PendingFeature
import spock.lang.Subject

class SampleSpec extends Specification {

    @Subject
    private final converter = new Converter()

    @PendingFeature
    void 'new feature to make String upper case'() {
        given:
        def value = 'Spock is awesome'

        expect: // This will fail no more
        converter.upper(value) == 'SPOCK IS AWESOME'
    }

}

class Converter {
    String upper(String value) {
        value.toUpperCase()
    }
}

We run the test again and now we get a failing result, even though our implementation of the upper method is correct.

So this tells us the @PendingFeature is no longer needed. We can remove it and the specification will pass correctly.

Written with Spock 1.1.

April 26, 2017

Awesome Asciidoctor: Nested Tables

Defining tables in Asciidoctor is very easy. The start and end of the table are defined by |===. But if we want to add a new table to a table cell we cannot use the same syntax. To define a nested table we must replace the | separator with !. So instead of |=== to indicate the table boundaries we use !===. Also the cell separators are now ! instead of |. Finally we must make sure the table cell or column supports Asciidoc markup, so the nested table is properly created. We do this by configuring the cell or column with the a style.

In the following example Asciidoctor markup we have a simple table with a nested table in the second column and row. Notice we can still apply all table configuration to the nested table as well:

= Tables

== Nested tables

To nest a table in a table we must
use `!` as table separator instead of `|`.
Also the type of the column or cell
must be set to `a` so Asciidoc markup
is processed.

[cols="1,2a"]
|===
| Col 1 | Col 2

| Cell 1.1
| Cell 1.2

| Cell 2.1
| Cell 2.2

[cols="2,1"]
!===
! Col1 ! Col2

! C11
! C12

!===

|===
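
To generate HTML we can run the asciidoctor command (tables.adoc is just an example file name for the markup above):

$ asciidoctor tables.adoc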

In the generated HTML we can see the nested table rendered inside the cell in the second column of the second row of the outer table.

Written with Asciidoctor 1.5.5.

April 21, 2017

Gradle Goodness: Using Incremental Task Action

Gradle has incremental build support to speed up our builds. This means Gradle checks the inputs and outputs of a task and only executes the task if something changed, otherwise the task is skipped. In previous posts we learned how to add incremental build support to our tasks with annotations and with the inputs and outputs properties of a task. When we have a task that creates an output file for every input file, like with transformations, we can make the task even more efficient with an incremental task action. With an incremental task action we get extra information on the files that are handled by the task. We can define different actions based on whether an input file is out of date or removed. This way with incremental builds we only handle the input files that have changed or been removed, instead of all the input files.

To create an incremental task action we must have a task action method (annotated with @TaskAction) that has a single argument of type IncrementalTaskInputs. The IncrementalTaskInputs class has the methods outOfDate and removed. These methods take an action, which can be implemented with a closure, with an instance of InputFileDetails as argument. Via this instance we can get to the input file and use it for our task logic. When an input file is out of date, because the file contents have changed or the output file has been removed, the action we defined for the outOfDate method is invoked. If the input file is removed, the action for the removed method is invoked.

In the following example we have a task HtmlConverter with a task action that supports incremental builds for input files. When an input file has changed it is processed, otherwise it is skipped. In the build file we create the task convert that uses the HtmlConverter task class:

// Create task to convert text to HTML.
task convert(type: HtmlConverter) {
    sourceDir = file('src/docs/text')
    outputDir = file("${buildDir}/html")
}

import groovy.xml.MarkupBuilder
import groovy.transform.CompileStatic
import groovy.transform.CompileDynamic

/**
 * Simple Gradle task that takes an text
 * input file and converts it to a HTML file.
 */
@CompileStatic
class HtmlConverter extends DefaultTask {

    @InputDirectory
    @PathSensitive(PathSensitivity.RELATIVE)
    File sourceDir

    @OutputDirectory
    File outputDir

    /**
     * Task action that will check if a source file is out of date
     * or removed. If out of date the source file is converted
     * to HTML. If source file is removed the generated 
     * HTML file is removed.
     *
     * @param inputs Used for incremental task action.
     */
    @TaskAction
    void convert(IncrementalTaskInputs inputs) {
        // If the user for example used --rerun-tasks
        // this task is not incremental. Only
        // inputs.outOfDate is executed, so we must first 
        // remove all output files.
        if (!inputs.incremental) {
            project.delete(outputDir.listFiles())
        }

        // Input file has changed, so we convert it.
        inputs.outOfDate { InputFileDetails outOfDate ->
            convertFile(outOfDate.file)
        }

        // Input file is removed, so we remove the
        // output file that was created for the input file.
        inputs.removed { InputFileDetails removed ->
            removeOutputFile(removed.file)
        }
    }

    /**
     * Convert text file to HTML.
     *
     * @param file Text file to convert to HTML.
     */
    private void convertFile(final File file) {
        logger.lifecycle 'Convert file {}', file.name
        final lines = file.readLines()
        final outputFile = new File(outputDir, outputFilename(file))
        // Use withWriter so the writer is flushed and closed for us.
        outputFile.withWriter { writer ->
            writeHtml(lines, writer)
        }
    }

    /**
     * Use first line as title for HTML, rest is body.
     *
     * @param lines Lines to transform to HTML.
     * @param writer Writer to write HTML to.
     */
    @CompileDynamic
    private void writeHtml(final List lines, final Writer writer) {
        final html = new MarkupBuilder(writer)
        html.html {
            head {
                title lines[0]
            }
            body {
                lines[2..-1].each { line ->
                    p line
                }
            }
        }
    }

    /**
     * Remove the output file that was created for the
     * given input file.
     *
     * @param file Input file to remove output file for.
     */
    private void removeOutputFile(final File file) {
        logger.lifecycle 'Remove HTML for file {}', file.name
        new File(outputDir, outputFilename(file)).delete()
    }

    /**
     * Determine the HTML output filename based on the base name of the input file.
     *
     * @param file Input file used to create the HTML output file name.
     * @return The name of the HTML output file.
     */
    private String outputFilename(final File file) {
        file.name[0..file.name.lastIndexOf('.')] + 'html'
    }

}

In our project we have 3 source files in the directory src/docs/text: sample1.txt, sample2.txt and hello.txt. We run the convert task for the first time and we see all input files are processed:

$ gradle convert
:convert
Convert file hello.txt
Convert file sample1.txt
Convert file sample2.txt

BUILD SUCCESSFUL

Total time: 0.948 secs

Next we change hello.txt, and when we re-run the task we see only our changed file is processed. If we then rename the file, we can see hello.html is removed and the new file is processed:

$ echo "Gradle rocks" >> src/docs/text/hello.txt
$ gradle convert
:convert
Convert file hello.txt

BUILD SUCCESSFUL

Total time: 0.793 secs
$ mv src/docs/text/hello.txt src/docs/text/sample.txt
$ gradle convert
:convert
Convert file sample.txt
Remove HTML for file hello.txt

BUILD SUCCESSFUL

Total time: 0.76 secs
$

Written with Gradle 3.5.

April 19, 2017

Gradle Goodness: Change Local Build Cache Directory

Gradle 3.5 introduced the build cache. With the build cache we can reuse task output from builds that can come from different computers. We can also use the build cache feature for our local builds. By default the directory to store the cache is located in the Gradle user home directory on our computer (USER_HOME/.gradle/caches/build-cache-1). We can change the directory for the local cache via settings.gradle of our Gradle project. For example we could configure a directory in our project file structure to be the build cache directory. With the default location in the Gradle user home directory, the caches of all Gradle projects we run on our computer are stored in a single directory, and because that cache doesn't shrink and will only grow, we might want more control over where the cache of a single Gradle project is stored. By keeping the cache inside the project we can easily clean it, because all cached files of the project are stored in a directory that is not shared with other Gradle projects.

In the following example settings.gradle file we configure our build cache directory to be the directory build-cache in the root directory of our project where we store our settings.gradle file:

// File: settings.gradle
buildCache {
    local {
        // Set local build cache directory.
        directory = "${settingsDir}/build-cache"
    }
}
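
Because the cache directory is now part of our project, cleaning the cache is as simple as deleting that directory. As a minimal sketch (the task name cleanBuildCache is our own choice) we could add a task for this to our build file:

// File: build.gradle
task cleanBuildCache(type: Delete) {
    // Remove the project local build cache directory
    // that is configured in settings.gradle.
    delete "${rootDir}/build-cache"
}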

Written with Gradle 3.5.

April 14, 2017

Gradle Goodness: Enable Build Cache For All Builds

Gradle 3.5 introduced the build cache. With the build cache we can share task output between builds on different computers. For example the build output from a continuous integration server can be used on a developer's computer. To use the build cache feature we add the command-line option --build-cache to our build invocation. Instead of using the command-line option for every build, we can set the Gradle property org.gradle.caching to true in the gradle.properties file of our project. To enable the build cache for all our projects we set the property in the gradle.properties file in the Gradle user home directory, which is usually at USER_HOME/.gradle/gradle.properties.
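
For example we can enable the build cache for a single build from the command line:

$ gradle build --build-cache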

In the following example we set the property org.gradle.caching in ~/.gradle/gradle.properties:

# File: ~/.gradle/gradle.properties
org.gradle.caching=true

If the build cache is enabled via the global property, we can still use the command-line option --no-build-cache to disable the build cache for a particular build.
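
For example we can run the build task without the build cache, even though org.gradle.caching is set to true:

$ gradle build --no-build-cache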

Written with Gradle 3.5.

April 12, 2017

Spring Sweets: Hiding Sensitive Environment Or Configuration Values From Actuator Endpoints

We can use Spring Boot Actuator to add endpoints to our application that expose information about our application. For example we can request the /env endpoint to see which Spring environment properties are available, or use /configprops to see the values of properties defined using @ConfigurationProperties. Sensitive information like passwords and keys is replaced with ******. Spring Boot Actuator has a list of property names that are considered sensitive and therefore have their value replaced with ******. The default list of keys that have their value hidden is defined as password,secret,key,token,.*credentials.*,vcap_services. A value in this list is either the ending of a property name or a regular expression. We can define our own list of property names for which the values should be sanitized and replaced with ******. We define the keys we want to hide using the application properties endpoints.env.keys-to-sanitize and endpoints.configprops.keys-to-sanitize.

In the following example Spring application YAML configuration we define new values for keys we want to be sanitized. Properties in our Spring environment that end with username or password should be sanitized. For properties set via @ConfigurationProperties we want to hide values for keys that end with port and key:

# File: src/main/resources/application.yml
endpoints:
  env:
    # Hide properties that end with password and username:
    keys-to-sanitize: password,username
  configprops:
    # Also hide port and key values from the output:
    keys-to-sanitize: port,key
---
# Extra properties will be exposed
# via /env endpoint.
sample:
  username: test
  password: test
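
We can request the endpoint with for example curl (assuming our application runs locally on port 8080 and the actuator endpoints are accessible without authentication):

$ curl -s http://localhost:8080/env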

When we request the /env endpoint we see in the output that the values of properties that end with username and password are hidden:

...
    "applicationConfig: [classpath:/application.yml]": {
        ...
        "sample.password": "******",
        "sample.username": "******"
    },
...

When we request the /configprops endpoint we see in the output that for example the key and port properties are sanitized:

...
    "spring.metrics.export-org.springframework.boot.actuate.metrics.export.MetricExportProperties": {
        "prefix": "spring.metrics.export",
        "properties": {
            ...
            "redis": {
                "key": "******",
                "prefix": "spring.metrics.application.f2325e314fc8223e6bb8ee6ddebbbd79"
            },
            "statsd": {
                "host": null,
                "port": "******",
                "prefix": null
            }
        }
    },
...

Written with Spring Boot 1.5.2.RELEASE.

April 11, 2017

Spocklight: Set Timeout On Specification Methods

When we write a feature method in our Spock specification to test our class, we might run into long running methods that are invoked. We can specify a maximum time we want to wait for a method. If the method takes longer than the maximum time, our feature method must fail. Spock has the @Timeout annotation to define this. We can apply the annotation to our specification or to feature methods in the specification. We specify the timeout value as argument for the @Timeout annotation. Seconds are the default time unit. If we want to specify a different time unit we can use the annotation argument unit and set a value with a constant from java.util.concurrent.TimeUnit.

In the following example specification we set a general timeout of 1 second for the whole specification. For two methods we override this default timeout with their own value and unit:

package mrhaki.spock

@Grab('org.spockframework:spock-core:1.0-groovy-2.4')
import spock.lang.Specification
import spock.lang.Subject
import spock.lang.Timeout

import static java.util.concurrent.TimeUnit.MILLISECONDS

// Set a timeout for all feature methods.
// If a feature method doesn't return in 1 second
// the method fails.
@Timeout(1)
class SampleSpec extends Specification {

    @Subject
    private final Sample sample = new Sample()

    // Check that method will return within 1 second.
    void 'timeout will not happen'() {
        expect:
        sample.run(500) == 'Awake after 500 ms.'
    }

    // Method will fail, because it doesn't return in 1 second.
    void 'method under test should return in 1 second'() {
        expect:
        sample.run(1500) == 'Awake after 1500 ms.'
    }

    // We can change the timeout value and 
    // the unit. The unit type is 
    // java.util.concurrent.TimeUnit.
    @Timeout(value = 200, unit = MILLISECONDS)
    void 'method under test should return in 200 ms'() {
        expect:
        sample.run(100) == 'Awake after 100 ms.'
    }

    // Method will fail.
    @Timeout(value = 100, unit = MILLISECONDS)
    void 'method under test should return in 100 ms'() {
        expect:
        sample.run(200) == 'Awake after 200 ms.'
    }

}

// Simple class for testing.
class Sample {
    /**
     * Run method and sleep for specified timeout value.
     *
     * @param timeout Sleep number of milliseconds specified
     *                by the timeout argument.
     * @return String value with simple message.
     */
    String run(final Long timeout) {
        sleep(timeout)
        "Awake after $timeout ms."
    }
}

Written with Spock 1.0-groovy-2.4.

April 10, 2017

Spocklight: Ignoring Other Feature Methods Using @IgnoreRest

To ignore feature methods in our Spock specification we can use the annotation @Ignore. Any feature method or specification with this annotation is not invoked when we run a specification. With the annotation @IgnoreRest we indicate that feature methods that do not have this annotation must be ignored. So any method with the annotation is invoked, but the ones without aren't. This annotation can only be applied to methods and not to a specification class.

In the next example we have a specification with two feature methods that will be executed and one that is ignored:

@Grab('org.spockframework:spock-core:1.0-groovy-2.4')
import spock.lang.Specification
import spock.lang.IgnoreRest
import spock.lang.Subject

class SampleSpec extends Specification {

    @Subject
    private final underTest = new Sample()

    @IgnoreRest
    void 'run this spec'() {
        expect:
        underTest.message('Asciidoctor') == 'Asciidoctor is awesome.'
    }
    
    @IgnoreRest
    void 'run this spec also'() {
        expect:
        underTest.message('Groovy') == 'Groovy is awesome.'
    }

    void 'ignore this spec'() {
        expect:
        underTest.message('Word') == 'Word is awesome'
    }
    
}

class Sample {
    String message(String tool) {
        println "Getting message for $tool"
        "$tool is awesome."
    }
}

We can run this specification directly from the command line:

$ groovy SampleSpec.groovy
Getting message for Asciidoctor
Getting message for Groovy
JUnit 4 Runner, Tests: 2, Failures: 0, Time: 54
$

Written with Spock 1.0 and Groovy 2.4.10.

April 6, 2017

Ratpacked: Conditionally Map Or Flatmap A Promise

When we want to transform a Promise value we can use the map and flatMap methods. There are also variants of these methods that will only transform the value when a given predicate is true: mapIf and flatMapIf. We provide a predicate and a function to these methods. If the predicate is true the function is invoked, otherwise the function is not invoked and the promised value is returned as is.

In the following example we have two methods that use the mapIf and flatMapIf methods of the Promise class:

// File: src/main/java/mrhaki/ratpack/NumberService.java
package mrhaki.ratpack;

import ratpack.exec.Promise;

public class NumberService {

    public Promise<Integer> multiplyEven(final Integer value) {
        return Promise.value(value)
                      .mapIf(number -> number % 2 == 0, number -> number * number);
    }
    
    public Promise<Integer> multiplyTens(final Integer value) {
        return Promise.value(value)
                      .flatMapIf(number -> number % 10 == 0, number -> multiplyEven(number));
    }
    
}

Now we take a look at the following specification to see the result of the methods with different input arguments:

// File: src/test/groovy/mrhaki/ratpack/NumberServiceSpec.groovy
package mrhaki.ratpack

import ratpack.test.exec.ExecHarness
import spock.lang.Specification
import spock.lang.Subject

class NumberServiceSpec extends Specification {

    @Subject
    private final numberService = new NumberService()

    void 'even numbers must be transformed with mapIf'() {
        when:
        final result = ExecHarness.yieldSingle {
            numberService.multiplyEven(startValue)
        }

        then:
        result.value == expected

        where:
        startValue || expected
        1          || 1
        2          || 4
        3          || 3
        4          || 16
    }

    void 'ten-th numbers must be transformed with flatMapIf'() {
        when:
        final result = ExecHarness.yieldSingle {
            numberService.multiplyTens(startValue)
        }

        then:
        result.value == expected

        where:
        startValue || expected
        1          || 1
        10         || 100
        2          || 2
        20         || 400
    }
}

Written with Ratpack 1.4.5.