Wednesday, June 1, 2011

Introducing SodaTest: Spreadsheet-Driven Integration Testing for Scala and Java

I'd like to reveal what I've been working on in my spare time for the last couple of months.

The Announcement

SodaTest: "Spreadsheet-Driven Integration Testing", is an open-source framework for creating Executable Requirements for Integration and Acceptance testing in Scala and Java.

The impetus for starting this project was an attempt to create a tool that improves on Ward Cunningham's Framework for Integrated Test, "FIT". As an 'Executable Requirements' testing tool, it can also be considered an alternative to FitNesse, Concordion, RSpec, Cucumber, JDave, JBehave, SpecFlow, and ThoughtWorks' Twist.

The Background

We used FIT in anger when I first arrived at my current workplace over four years ago. Since then I've watched the team become first dissatisfied with it, then scathing and fearful of it, before abandoning development of new FIT tests in favour of Integration Tests written in JUnit. Nevertheless, we all still felt that Executable Requirements were a Good Thing, and we made a couple of failed restarts in trying to get back on the wagon.

From my own perspective, many of our issues (but not all) were to do with the tool. I identified two main issues that I'd seen hurt people over and over: the input format and the programming model. So I set out to create something that solved these two problems while remaining fairly close to what I think is a great foundation in FIT.

The Result

The result is a tool I've called SodaTest. It uses spreadsheets, in the form of comma-separated value (CSV) files, as the input format. The contents of the spreadsheet are basically small individual tables, much like what would appear in a FIT test. There is a required but simple structure to the tables and a minimum of special words and symbols to provide some context to the parser.
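
To make this concrete, here's a rough sketch of the kind of tables a SodaTest spreadsheet might contain. It's purely illustrative: the table names and keywords below are just my shorthand for this post, and the exact structure and special words the parser expects are described in the README.

Event,Open Account
Account Name,Opening Balance
Savings,100.00

Report,Account Balance
Account Name,Balance
Savings,100.00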

I also aimed to keep the programming model as simple as possible. I tried to make sure there is only one sensible way things can be done, so as not to confuse developers with options. I've gone to lengths to ensure developers of Fixtures have a minimum of work to do by having the framework handle a lot of the boilerplate for them. Lastly, I've structured and named everything in a way that I believe will guide developers in the right direction (by which I mean away from mistakes that I have made in the past while writing FIT tests).
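
To give a feel for that programming model, here's a minimal sketch of what a Fixture might boil down to: mapping the names used in the spreadsheet onto 'Events' (things that act on the system under test) and 'Reports' (queries whose output is compared against the table). The trait and method names below are simplified for illustration and are not the exact SodaTest API, which is documented in the README.

// A simplified illustration of the Fixture idea; not the exact SodaTest API.
trait Event { def apply(parameters: Map[String, String]): Unit }
trait Report { def apply(parameters: Map[String, String]): Seq[Seq[String]] }

trait Fixture {
  def createEvent(name: String): Option[Event]
  def createReport(name: String): Option[Report]
}

// A Fixture for the illustrative 'Open Account' / 'Account Balance' tables above.
class AccountFixture extends Fixture {
  private var balance = BigDecimal(0)

  def createEvent(name: String) = name match {
    case "Open Account" => Some(new Event {
      def apply(parameters: Map[String, String]) {
        balance = BigDecimal(parameters("Opening Balance"))
      }
    })
    case _ => None
  }

  def createReport(name: String) = name match {
    case "Account Balance" => Some(new Report {
      def apply(parameters: Map[String, String]) =
        List(List("Balance"), List(balance.toString))
    })
    case _ => None
  }
}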

I've also taken the time to add a little sugar here and there that I thought was missing from FIT, for example integration with JUnit and a more comprehensive set of built-in coercion strategies for converting strings into strong types.
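
By 'coercion strategies' I mean turning the strings in spreadsheet cells into strongly-typed values before they reach your Fixture code. Just to illustrate the idea (this is not the framework's actual implementation), it's the kind of conversion shown here:

// Illustration of the coercion idea only; not SodaTest's built-in coercions.
def toMoney(cell: String): BigDecimal = BigDecimal(cell.replace("$", "").replace(",", ""))
def toFlag(cell: String): Boolean = Set("yes", "y", "true").contains(cell.trim.toLowerCase)

toMoney("$1,250.00")   // => 1250.00
toFlag("Yes")          // => true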

I'm quite pleased with the result. (I wouldn't be telling people about it if I wasn't!) Yesterday I released version 0.1 of SodaTest. This first release should let people create tests that do almost everything FIT could do, but with less effort in creating the tests, writing the Fixtures, and getting the whole lot executing in their environment.

You can find out more about the motivation for SodaTest and the features it includes by reading the README on GitHub.

While SodaTest is written almost entirely in Scala (and the most benefit will be gained by using Scala as the language for writing Fixtures), I've also written the sodatest-api-java extension that allows SodaTest Fixtures to be written in Java. There is one limitation where Scala is (currently) still needed, but I reckon 95% of Fixtures could be written entirely in Java if that is something you care about.

The Next Steps

The next steps for SodaTest are clear to me: Dogfood, Broadcast, Feedback and Evolve.

I want to convince my team at work to start experimenting with Executable Requirements again using my home-grown tool; I'd love it if other people in the Scala and Java communities could download this tool and give it a try during their next iteration or two; I want to hear feedback from people about what's good about SodaTest, what needs more work, and whether there are parts that are just plain horrible; and, if the feedback is positive enough to consider SodaTest a preliminary success, I want to continue improving it in the areas where it's still holding people back.

There is already a Roadmap of possible features to add, but now it's really time to get it in people's hands and find out from users what is the next most important thing it needs to do.

Try It Out!

If you're a Scala or Java software developer and Executable Requirements are either a passion of yours or something you've been wanting to try out, why don't you give SodaTest a try? You don't have to commit to it; just write a couple of tests with it, get them running, then passing, and send me some feedback. All feedback is useful, even if you think it sucks! (As long as you tell me why.)

To get started with SodaTest, I suggest you add the GitHub repository and the sodatest-api and sodatest-runtime dependencies to your Maven pom.xml:

<repositories>
  <repository>
    <id>sodatest-github-repo</id>
    <name>SodaTest repository on GitHub</name>
    <url>http://grahamlea.github.com/SodaTest/m2repository/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.sodatest</groupId>
    <artifactId>sodatest-api</artifactId>
    <version>0.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.sodatest</groupId>
    <artifactId>sodatest-runtime</artifactId>
    <version>0.1</version>
    <scope>test</scope>
  </dependency>
</dependencies>

4 comments:

  1. Hi Graham,

    I think that's a nice initiative and I hope you'll go on reporting on your experiences with that tool.

    As for myself, I've been implementing something in that space but very developer-oriented: http://etorreborre.github.com/specs2/guide/org.specs2.guide.Forms.html#Forms.

    Basically, text and fixtures are all Scala code. This appeals to me for several reasons but I'd understand if that wasn't the case for everyone!

    IMHO, a good middle ground is Twist. Why didn't you go that way?

    Eric.

  2. Hi Eric,

    Thanks for your comments.

    One of the things I've really struggled with since we started doing most of our integration testing in JUnit is the amount of code we've written for these tests! I'm sure 75% of what we've written is repeated (not in the same file, but in another test somewhere), but the lines between test input, test fixture and assertions are often so blurred that it's very hard to extract the fixture part for re-use. That's one of the reasons I loved FIT's approach of separating the test input and assertions out from the code entirely.

    As for Twist, there are a couple of things that made me not try it. Firstly, it seems to be very geared towards web testing, but the majority of my testing is backend-focussed. Secondly, I'm not convinced that "Express[ing] requirements directly as test specifications in English" is actually the way to go. Natural Language tests, to me, contain too much noise around the real data (and I think most accountants would agree, too). Many non-trivial examples that I've seen end up reverting to tables in the end anyway. Lastly, it's built on Eclipse, and I almost always code (at the speed of light) in IDEA, so I worried about how well the two would play together.

    That being said, I think they've done an AMAZING job at solving the main problem of other Natural Language test tools, which was having to augment the source form (usually HTML) with special stuff (see How It Works at http://www.concordion.org/). Building an IDE to maintain the source form for you is a knockout solution, though I imagine it's taken quite a lot of manpower to get there.

    Cheers,

    Graham.

  3. By the way Graham, I just found out that ZiBreve was out: http://tech.dir.groups.yahoo.com/group/fitnesse/message/17812

    Eric.

  4. Thanks, Eric. Providing tools to handle and reduce duplication in tests is very progressive, IMO. I like the idea.
