4 things I've learned from A/B testing websites

Document everything

A/B testing tools let you set up your test and add basic information such as a title and sometimes a description. I like to go further and record my assumptions and data outside the tool. This makes it easier to go back and see what was done, when, and why. It also makes the key information clear to the rest of your team without them needing an account for the tool you are using.

In my testing documents, I include:

  • When the test started
  • Hypothesis (what are we testing and why)
  • Expected results (our test should increase or decrease x)
  • Screenshots of what the tests look like
  • Actual results (after the test is over)
  • Action items of what is done after the test (implement the change you tested or go on to the next test)
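The checklist above could be captured as a simple record that lives outside the testing tool. Here's a minimal sketch in Python (the field names are my own, not from any particular tool):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ABTestRecord:
    """One document per test, kept outside the A/B testing tool."""
    title: str
    started: date            # when the test started
    hypothesis: str          # what are we testing and why
    expected_result: str     # our test should increase or decrease x
    screenshots: list[str] = field(default_factory=list)  # paths to images
    actual_result: str = ""  # filled in after the test is over
    action_items: str = ""   # implement the change, or move to the next test

record = ABTestRecord(
    title="Homepage headline test",
    started=date(2021, 3, 1),
    hypothesis="A benefit-led headline will lift sign-ups",
    expected_result="Sign-up conversion rate should increase",
)
```

The empty fields are a useful reminder: the document isn't done until the test is, and the action items are what turn a result into a change on the site.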


Have enough traffic

You need enough traffic in your A/B test to get meaningful results. Look at page traffic data in Google Analytics to find the pages where a test will produce results the fastest. If you have a low-traffic website or page, you may need to run the test longer to reach meaningful results.

If you don't have enough traffic, there are other ways to improve your site, including Google Ads experiments, user feedback, and reviewing your site analytics.

There are a few online tools that can help you estimate how much traffic, and how long, you will need to test to get meaningful results.
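The math behind those calculators is a standard two-proportion power calculation. A rough sketch, assuming a two-sided 95% confidence level and 80% power (the baseline rate, lift, and daily traffic below are made-up example numbers):

```python
import math

def sample_size_per_variant(baseline_rate, min_relative_lift,
                            z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variant to detect a relative lift of
    `min_relative_lift` over `baseline_rate` (normal approximation,
    two-sided 95% confidence, 80% power)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# e.g. 5% baseline conversion rate, hoping to detect a 20% relative lift
n = sample_size_per_variant(0.05, 0.20)

# at 500 visitors/day split evenly across two variants
days = math.ceil(2 * n / 500)
```

Plugging in a low-traffic page makes the trade-off concrete: the smaller the lift you want to detect, the more visitors (and days) the test needs.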

Further reading - A/B Testing Tech Note: determining sample size

No clear winner

Some A/B tests will end in a draw, with no clear winner, even after running long enough. The important part is to dig into that result and understand why.

Some thoughts on why the results are flat:

  • Does the test directly involve user interactions or is the change passive? (Changing button placement vs changing 1 word in a headline)
  • Are the changes bold enough to be noticed by the user?
  • Is the area you are testing affecting conversions?
  • Are there steps previous to your test in the user journey that you can test first?

Pair no-result tests with other metrics (heat maps, event analytics, page recordings) to further refine where you should be testing.
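Before calling a test flat, it's worth checking how likely it is that the observed difference is just noise. A minimal two-proportion z-test sketch (the visitor and conversion counts are invented for illustration):

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates,
    using the pooled normal approximation."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability

# e.g. 500 of 10,000 visitors converted on A, 520 of 10,000 on B
p = two_proportion_p_value(500, 10_000, 520, 10_000)
```

A p-value well above 0.05, as in this example, means the gap between variants could easily be chance, which is exactly when the other metrics above help you find a better thing to test.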

Always be testing

A/B testing should be a continuous improvement initiative that is baked into your analytics and user testing processes. Google Analytics is a good place to start when deciding which pages to test, and the results from a previous test are a great way to determine which tests to run next.

Let me know what you think on Twitter @rdallaire
