Tuesday, December 28, 2010

Test coverage: JaCoCo vs Clover2

Recently I've been doing some research on test coverage tools. In my company we've been using EMMA coverage metrics for a long time. It works well, its reports are clean and clear, and it has a very small impact on the build process. The main shortcoming of this library is that it does not support aggregated coverage stats for multi-module projects. Furthermore, it does not support integration / functional / manual test coverage.

I was looking for a tool which integrates easily with Maven, supports multi-module projects and - most importantly - measures integration test coverage. I evaluated two products: the commercial Clover created by Atlassian, and the free JaCoCo library from the EclEmma team.

A few findings from the Clover evaluation:

- As this coverage library uses source code instrumentation, its configuration has a very strong influence on the project build process: it changes the lifecycle of the build, i.e. forks it (see: http://bit.ly/hu1RoO), so the build takes twice as long (a minimal configuration sketch follows this list).

- Functional testing requires deploying instrumented ("clovered") web archives into the test container. Everything works perfectly when using the supported Maven infrastructure (i.e. the one described on the Atlassian Confluence): maven-cargo-plugin, Tomcat, etc. When using other plugins for packaging, deployment and so on, configuring the process is not that obvious. But that's my subjective opinion.

- Support for Clover evaluators is rather poor. The forum isn't working (a NullPointerException is thrown), the Confluence documentation is not perfectly up to date, and interesting topics are scattered throughout various pages (i.e. not aggregated under one root).

- Clover reports are great: I love their user-friendly and eye-catching stats. They show the main risks of a project, total hits for every line, etc. That's a great feature, indeed.
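
For reference, a minimal sketch of the kind of Clover setup I mean, assuming the maven-clover2-plugin of that era; the license path is a placeholder, and the exact goals and bindings depend on your build:

<plugin>
    <groupId>com.atlassian.maven.plugins</groupId>
    <artifactId>maven-clover2-plugin</artifactId>
    <configuration>
        <!-- placeholder: location of the Clover license file -->
        <licenseLocation>/path/to/clover.license</licenseLocation>
    </configuration>
</plugin>
<!-- typically invoked as: mvn clover2:instrument clover2:clover
     the instrument goal forks a parallel lifecycle on instrumented sources,
     which is why the build takes roughly twice as long -->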

The JaCoCo library is a different story...

- it's 100% non-intrusive for the build process: no lifecycle forking, no complicated build modifications are required

- as JaCoCo uses on-the-fly bytecode instrumentation, its usage is very flexible, e.g. on different test containers or directly on a production-ready web archive (no clovering needed). The only thing that needs to be done is to configure the integration test launchers (for example maven-failsafe-plugin) with a JVM argLine pointing to the JaCoCo Java agent and the JaCoCo report dump file path; see the sketch after this list. Don't forget to pass the same argument line to the JVM which launches the test container.

- it works just fine with multi-module projects
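
To illustrate the failsafe part, a minimal sketch of such a configuration; the agent jar path and the dump file location are placeholders, while destfile and append are standard JaCoCo agent options:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <configuration>
        <!-- attach the JaCoCo java agent to the JVM that runs the integration tests;
             /path/to/jacocoagent.jar is a placeholder for the agent jar from the JaCoCo distribution -->
        <argLine>-javaagent:/path/to/jacocoagent.jar=destfile=${project.build.directory}/jacoco-it.exec,append=true</argLine>
    </configuration>
</plugin>

The same -javaagent argument has to be passed to the JVM which launches the test container (for Tomcat e.g. via CATALINA_OPTS), so that coverage of the code exercised through the deployed web archive ends up in the dump file as well.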

And now the best part: the JaCoCo SONAR plugin:

- It integrates with SONAR via the sonar-jacoco plugin (http://bit.ly/hLo2ti), which has great user support and a wide community.

- its reporting is very clear and easy to configure. It offers separate unit test coverage stats, as well as stats for coverage by integration tests (in our case the most important part: tests automated with Selenium, plus manual tests).

- as the reporting part is managed by the Sonar engine, you can easily use the Clover or Cobertura libraries for reporting unit test coverage, while JaCoCo measures only the functional part (see the sketch below).
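
A sketch of how this split can be wired up, assuming the property names used by Sonar and the sonar-jacoco plugin at that time (sonar.core.codeCoveragePlugin and sonar.jacoco.itReportPath) - please verify them against the documentation of your plugin versions:

<properties>
    <!-- assumed property: let Cobertura (or Clover) report unit test coverage -->
    <sonar.core.codeCoveragePlugin>cobertura</sonar.core.codeCoveragePlugin>
    <!-- assumed property: JaCoCo contributes integration test coverage from its dump file -->
    <sonar.jacoco.itReportPath>${project.build.directory}/jacoco-it.exec</sonar.jacoco.itReportPath>
</properties>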

In the current release of the jacoco-sonar-plugin there is no widget for total coverage stats, but one is already prepared for the next release of the plugin, and a patch is provided (see the JIRA issue here: http://bit.ly/iehK22). It contains an updated JaCoCo Sonar sensor implementation with an algorithm which merges hits from unit and integration tests. Unfortunately, this issue is blocked by a related SONAR API issue (see: http://bit.ly/hFhlK4): the stats for integration tests are persisted in the database for each resource, but they can't be accessed from the JaCoCo Sonar decorators. Hopefully it will be fixed soon :)

To sum up: I highly recommend using the JaCoCo library and SONAR for measuring how well your code base is covered by tests, both unit and functional, as this combination fulfills all coverage reporting needs. Big thanks to the SONAR and EclEmma teams for developing such great tools.

For more information about sonar-jacoco-plugin go here:
http://www.sonarsource.org/measure-code-coverage-by-integration-tests-with-sonar/

Tuesday, December 14, 2010

Google App Engine project setup - the easiest way

Lately I've been struggling with the configuration of a development environment for my App Engine project: setting up all the necessary libraries, configuring them, etc. Even with the wonderful support for GAE projects in the recently released IntelliJ IDEA 10.0, some activities were really daunting. The main reason for that was the lack of default Maven 2 support in the GAE project lifecycle. There was even a discussion on the project page (http://bit.ly/1U94vD), but it was recently closed as fixed without adding "native" Maven support, sadly.

Yesterday I found a solution for quickly kickstarting App Engine projects: it's called jappstart (code.google.com/p/jappstart/). The default distribution is a fully pre-configured Maven template project, with the latest Spring, Spring Security, and even the Jackson JSON mapper as dependencies. Everything here works like a charm; the search is over ;-)

Friday, December 10, 2010

When bytecode analysis is too much

There are many tools available for static code analysis. Recently I started using PMD and FindBugs. The main difference between these two libraries is that one of them (FindBugs) operates on the bytecode produced by javac, whereas PMD works on the abstract syntax tree of Java source files, without needing to compile them.

Today I faced a problem which helped me understand static code analysis, and the way the Java compiler works, a little bit better.

The reporting tools used in my company declared many violations of our home-grown FindBugs rule. In brief, its purpose was to recognize all empty catch clauses. Although this rule's behaviour was very well unit tested against various scenarios, it turned out to be the most violated rule. What was even stranger: it reported violations not only on empty catch blocks, as it was supposed to, but also on perfectly legal switch statements which used enums as arguments. Here is an example:

public void testIt(DummyEnum anEnum) {
        switch (anEnum) {
            case ONE:
                // do sth
                break;
            case TWO:
                // do sth else
                break;
            default:
                throw new IllegalStateException("dummy exception");
        }
}

Usage of javap and the Java disassembler revealed the cause of the problem. It turned out that behind the scenes enum switches are mapped to lookups in synthetic static int arrays, and the code which fills those arrays is translated into a series of try-catch (NoSuchFieldError) blocks - with empty catch blocks. Analysis of the exception tables printed by javap clearly shows where execution jumps when such an exception occurs.

public void testIt(DummyEnum anEnum) {
        // what javac generates behind the scenes (simplified): a plain int switch
        // over a lookup in a synthetic static array
        switch (EnclosingClass$1.$SwitchMap$DummyEnum[anEnum.ordinal()]) {
            case 1: // ONE
                // do sth
                break;
            case 2: // TWO
                // do sth else
                break;
            default:
                throw new IllegalStateException("dummy exception");
        }
}

// the synthetic array is filled in a static initializer of a generated class,
// and this is where the empty catch blocks reported by the rule come from (simplified):
//     static final int[] $SwitchMap$DummyEnum = new int[DummyEnum.values().length];
//     static {
//         try { $SwitchMap$DummyEnum[DummyEnum.ONE.ordinal()] = 1; } catch (NoSuchFieldError e) { }
//         try { $SwitchMap$DummyEnum[DummyEnum.TWO.ordinal()] = 2; } catch (NoSuchFieldError e) { }
//     }
As the PMD library does not operate on bytecode, it was much easier to rewrite this rule for PMD (using an XPath expression) than to patch the existing FindBugs rule; a sketch of such a rule follows below. In this case, using FindBugs turned out to be overkill.
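
A sketch of such an XPath-based rule, assuming the PMD 4.x grammar and rule class; the node names (CatchStatement, Block, BlockStatement) and the XPathRule class may differ in other PMD versions, and the rule name and message are made up for this example:

<ruleset name="custom-rules">
    <rule name="EmptyCatchClause"
          message="Avoid empty catch clauses"
          class="net.sourceforge.pmd.rules.XPathRule">
        <description>Reports catch blocks that contain no statements.</description>
        <priority>3</priority>
        <properties>
            <property name="xpath">
                <value><![CDATA[
//CatchStatement[count(Block/BlockStatement) = 0]
                ]]></value>
            </property>
        </properties>
    </rule>
</ruleset>

Since the generated try-catch (NoSuchFieldError) blocks exist only in the bytecode and not in the source, a source-level rule like this one never sees them, so the enum switch false positives disappear.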