Throughout my career as a software engineer I’ve always valued the relationship between the engineering team and the customer, the ultimate consumer of our work. In a way it reminds me of the relationship craftsmen such as blacksmiths had with their customers back in days of yore, when they would be in constant communication, receiving regular visits to discuss progress and perhaps suggest alterations as the tool took shape.

Over the last twenty years, the agile software movement and extreme programming in particular have championed this way of working, with short iterations, showcases and customer participation in feature generation.

Extreme programming popularised the idea of creating a suite of acceptance tests in advance of implementing a feature. This gave the engineers a sense of when the product was ready to be launched and formalised the collaboration between customer and engineer.

As engineers we enjoy automation; it’s the very reason many of us started on this career path in the first place. This led to many automation tools springing up to facilitate these processes. Why go through the long, laborious process of manually checking that the software still works as intended after every change when you can get a computer to check it for you?

At MarketInvoice, when we were building our Business Loans product, the engineering team collaborated closely with subject matter experts, producing a spreadsheet of examples of what the repayment schedule for a loan would be. We took these examples and turned them into a suite of tests with NUnit that were run as part of our build pipeline.
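The NUnit tests themselves aren’t shown here, but the pattern is worth seeing: each row of the experts’ spreadsheet becomes one automated example. The sketch below illustrates the idea in Ruby’s Minitest, using a deliberately simplified flat-rate repayment calculation (the real Business Loans logic, and the names `RepaymentCalculator` and `monthly_repayment`, are illustrative assumptions, not the production code):

```ruby
require "minitest/autorun"

# Purely illustrative stand-in for the real repayment logic:
# a flat-rate loan repaid in equal monthly instalments.
class RepaymentCalculator
  def monthly_repayment(principal, flat_rate, months)
    total = principal * (1 + flat_rate)
    (total / months).round(2)
  end
end

# Each row of the subject matter experts' spreadsheet becomes one example:
# principal, flat rate, term in months, expected monthly repayment.
EXAMPLES = [
  [12_000, 0.10, 12, 1_100.00],
  [9_000,  0.08, 6,  1_620.00],
].freeze

class RepaymentScheduleTest < Minitest::Test
  def test_spreadsheet_examples
    calc = RepaymentCalculator.new
    EXAMPLES.each do |principal, rate, months, expected|
      assert_equal expected, calc.monthly_repayment(principal, rate, months)
    end
  end
end
```

Running this suite in the build pipeline means a change that alters any agreed example fails the build immediately.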

While tests written with NUnit (and similar frameworks) help prove the correctness of the system, they tend to be difficult for the customer to understand (even customers with some engineering experience). Is there a way we can continue this dialogue between engineering and the rest of the business (including customers) beyond the initial collaboration? Can we write tests in such a way that they are understandable to non-engineers?

Well, it turns out we can. Over the years, several engineering teams have looked for ways to bridge the gap between themselves and their customers. What they wanted was a way to take the specifications the customer was writing and use them to exercise the software, checking that it behaved correctly (and, indeed, continued to behave correctly as the software evolved).

Ward Cunningham created FIT, the Framework for Integrated Test. He wanted a tool that would allow customers to use software they were already familiar with, such as Microsoft Office, to give software engineers examples of how the system should behave. FIT would check those examples against the actual system, bridging the gap between the subject matter experts and the engineering teams. It does this by parsing HTML tables (easily generated in programs like Word), which are interpreted by a ‘fixture’ that the engineering team has written. This fixture executes the actual code used to build the system and compares its behaviour to what the table expects.

For example, let’s say we are building a calculator that will perform division for us:
| Division |
| Numerator | Denominator | Quotient() |
| 10        | 2           | 5          |
| 12.6      | 3           | 4.2        |
| 100       | 4           | 25         |

The fixture for this example would look something like this:

public class Division : ColumnFixture
{
    public double Numerator;
    public double Denominator;

    public double Quotient()
    {
        return new Calculator().Divide(Numerator, Denominator);
    }
}

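To make the mechanics concrete, here is a toy sketch of what a FIT-style column-fixture runner does: for each table row it sets the input columns on a fixture, calls the computed column, and compares the result. This is an illustration of the idea only, not FIT’s actual implementation, and the names (`DivisionFixture`, `run_table`) are assumptions:

```ruby
# A toy column fixture: inputs are fields, the checked column is a method.
class DivisionFixture
  attr_accessor :numerator, :denominator

  def quotient
    numerator / denominator
  end
end

# Interpret each table row: set inputs, compute the output column,
# and record whether it matches the expected value from the table.
def run_table(rows)
  rows.map do |row|
    fixture = DivisionFixture.new
    fixture.numerator = row["Numerator"]
    fixture.denominator = row["Denominator"]
    actual = fixture.quotient
    { expected: row["Quotient()"], actual: actual, pass: row["Quotient()"] == actual }
  end
end

results = run_table([
  { "Numerator" => 10.0, "Denominator" => 2.0, "Quotient()" => 5.0 },
  { "Numerator" => 100.0, "Denominator" => 4.0, "Quotient()" => 25.0 },
])
puts results.all? { |r| r[:pass] }
```

The key design point is that the fixture calls the same production code the system uses, so the customer’s table is checking real behaviour, not a parallel reimplementation.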
Of course, maintenance of these fixtures became something of an issue for teams as the number of tests increased. Many teams solved this by introducing a Domain Specific Language, essentially expressing the behaviour through an agreed set of natural language constructs. At the heart of this was what is known as ubiquitous language: an agreed set of definitions that engineers and customers use to describe the system.

This eventually led to the birth of Behaviour Driven Development (BDD). BDD extended the principles of Test Driven Development by embracing the idea of expressing what the system should do in natural language. It took the idea of ubiquitous language one step further, suggesting a template for expressing testing scenarios: ‘Given a context, when something happens, then some verifiable outcomes occur’.

An early pioneer of this, Dan North, created a Java tool, JBehave, to write tests in this format. JBehave eventually led to the best known of the BDD tools, Cucumber.

Cucumber took the Given-When-Then format and expanded it with a way of organising this living documentation into features, scenarios, scenario outlines and examples, calling the resulting language Gherkin.

If we take the above example and express it in Gherkin we may end up with something like this:

Feature: As an accountant I want to divide two numbers

Scenario Outline: Divide Two Numbers
  Given I have powered the calculator on
  When I enter <Numerator> into the calculator
  And I enter <Denominator> into the calculator
  And I press divide
  Then the result should be <Quotient> on the screen

  Examples:
    | Numerator | Denominator | Quotient |
    | 10        | 2           | 5        |
    | 12.6      | 3           | 4.2      |
    | 100       | 4           | 25       |

Cucumber will parse this document and look for step definitions that tell it how to translate the text into actions performed against the system. Step definitions are defined using regular expressions, and Cucumber will look for a match for each Gherkin step. For the above feature, the step definitions would look something like the following snippet of Ruby:

Given /^I have powered the calculator on$/ do
  @calculator = Calculator.new
end

When /^I enter (\d+(?:\.\d+)?) into the calculator$/ do |number|
  @calculator.enter(number.to_f)
end

Then /^the result should be (\d+(?:\.\d+)?) on the screen$/ do |result|
  @calculator.divide.should eq(result.to_f)
end

Such is the popularity of Cucumber that the original Ruby implementation has been ported to other languages such as Java, JavaScript and C# (under the guise of SpecFlow).

These tools prove particularly useful in a FinTech like MarketInvoice, where calculations are vitally important. Working closely with financial experts within the company enables engineers to build up a suite of concrete examples with automated verification, helping the engineering team stay close to the rest of the business and ensuring the continued quality of the product as we move forward.

At MarketInvoice, automated verification like this will be vital as we scale the system. We will need early feedback if calculations are incorrect, and confidence that what we are about to release doesn’t break any of the existing system.

Further Reading