Behind the scenes

From gut feeling to test culture

Christian Sager
18.9.2020
Translation: machine translated
Pictures: Thomas Kunz
Co-author: Christoph Glaser

In software development, we make decisions based on data, and A/B tests help us do that. When our external testing tool no longer met our needs, we quickly built our own.

For a good three months now, digitec and Galaxus customers have been able to shop in a climate-neutral way, thanks to a CO2 compensation option in the check-out. But not all customers were able to offset right from the start: during the first two weeks, we ran an A/B test. One group of shoppers saw the familiar variant A of the check-out, without the CO2 compensation option; the other group saw the previously unknown variant B, with the option to offset their CO2 emissions. We wanted to find out how our customers would respond to the CO2 compensation feature.

Such A/B tests are now an established tool in software development for measuring the effectiveness of new functionality. According to Google, only about 30% of all A/B tests turn out positive. This in turn means that without testing, there is a roughly 70% chance of shipping a change that has a negative impact.

Quote from W.E. Deming: «Without data you're just another person with an opinion.»

Easier said than done

Even if the basic principle of A/B testing is quickly understood, the subject harbours a great deal of complexity: from a solid test hypothesis, through correct set-up and tracking, to evaluation and the formulation of follow-up hypotheses. For this reason, there are countless testing tools on the market that simplify certain aspects of A/B testing for software teams or even take them off their hands completely. Until now, we at Digitec Galaxus also used an external testing tool. But as our engineering crew gained testing experience and climbed the learning curve, the development teams ran into various problems with it. This is where things got stuck:

  • High demands on page performance: we want to avoid testing having an impact on loading times and therefore the user experience.
  • Lack of transparency: the tool we used previously was a black box for us. We had no detailed insight into which methods were used to generate the test results.

And we didn't want to compromise on our culture of experimentation either: we want to control variant assignment in our A/B tests ourselves and offer our users the fastest possible tests.
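
To illustrate what controlling the assignment ourselves can mean in practice, here is a minimal sketch of deterministic variant bucketing: hashing a stable user ID together with an experiment name places every visitor in the same bucket on every request, with no round trip to an external service. The function and experiment names are hypothetical, not our actual implementation.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, traffic_to_b: float = 0.5) -> str:
    """Deterministically assign a user to variant "A" or "B".

    Hashing the user ID together with the experiment name gives each
    visitor a stable, experiment-specific bucket, so the same person
    always sees the same variant. (Illustrative sketch only.)
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = (int(digest, 16) % 10_000) / 10_000  # roughly uniform in [0, 1)
    return "B" if bucket < traffic_to_b else "A"

# The same customer always lands in the same check-out variant:
print(assign_variant("customer-42", "co2-compensation-checkout"))
```

Because the assignment is a pure function of user and experiment, it can run inside the application itself rather than waiting on a third-party script, which is exactly the kind of performance property described above.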

The solution was quickly clear

Within a short space of time, a compact and motivated group had come together to tackle the problem: a colourful mix of software engineers, an analyst, UX researchers and product owners.

The first question we asked ourselves was what options we had:

  • Solve the problems with the existing tool
  • Evaluate a new tool
  • Build our own tool
Different perspectives united in one team.

Together with two external A/B testing experts, we got to the bottom of the performance problems and carried out a small audit. We quickly realised that building a tool of our own to match our requirements wouldn't be rocket science. It would also give us full transparency and control, and allow us to develop the tool further as our needs evolve.

The testing tool also needs to be tested

Development of the new tool then began. The developers didn't drop everything else, but invested a few minutes here and there alongside their primary work. And lo and behold: within three weeks, the first prototype was ready for its first test. We then ran a series of A/A tests, in which both groups see the identical variant, to validate the plausibility of the data and the test results. Once the validation had earned its green tick, we launched the first end-customer test after a total of six weeks. We were all eager to see the initial results and whether the new tool would hold up. To our relief, there were no nasty surprises and the first test went through without a hitch!
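
For illustration, here is what such an A/A plausibility check can look like: since both arms see the identical experience, sound assignment and tracking should produce conversion rates that do not differ significantly. The counts below are made up, and a chi-squared test via scipy is just one standard way to check this, not necessarily the method our tool uses.

```python
from scipy.stats import chi2_contingency

# Made-up A/A test counts: both arms saw the identical check-out.
#            converted  not converted
observed = [[1_180,     48_820],   # arm A1
            [1_205,     48_795]]   # arm A2

chi2, p_value, dof, expected = chi2_contingency(observed)

print(f"p-value: {p_value:.3f}")
if p_value < 0.05:
    print("Suspicious: identical arms differ - check assignment and tracking.")
else:
    print("Plausible: no significant difference between the A/A arms.")
```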

There was a lot of excitement before the first test with real end customers.

Five months have passed since then, and we have already run dozens of tests with our tool. There are still challenges. But because we have full control over the tool, we can react quickly and develop it according to the needs of our teams. Over time, the colourful and motivated engineering crew has grown into a well-established group that continues to drive A/B testing forward within the company.

Join the team

Would you also like to solve problems in an uncomplicated way? Then take a look at our vacancies.


Fast learning through small but valuable steps is the key to successful products.

