Conversion at what cost?

Every online business should be focused on conversion – it goes without saying. A/B testing (under which I’ll also include multivariate testing, even though they differ) is a great way for businesses to determine what works and what doesn’t when it comes to converting leads into customers. The recent rise in availability of A/B testing tools is helping many online businesses gain valuable insights they never could before, and to test ideas more accurately than ever. But should it be given the credence it seems to have at the moment?
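To ground what these tools are doing under the hood, here is a minimal sketch of a two-proportion z-test – one common way A/B testing tools decide whether a variant “wins”. This is an illustration, not anything from the article; the function name and all the numbers are made up for demonstration.

```python
# Illustrative sketch: a two-proportion z-test, one common way A/B
# testing tools decide whether variant B beats variant A.
# All data below is hypothetical.
from math import sqrt, erf

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z score, two-sided p-value) for two conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. "Download our software" vs "Get started now" (invented numbers)
z, p = ab_z_test(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Note that a test like this only tells you *that* one variant converted better over the measurement window – nothing about why, or about the longer-term effects on trust and brand perception discussed below.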

The thing is, I don’t think A/B testing gives a very complete picture. A lot of people and businesses seem to see it as the be-all and end-all: if the data says one option converts better, that’s the one they go with. I think this is short-sighted. A/B testing can tell you what works for short-term goals and specific interactions, but we need to keep sight of other things like trust and brand perception, which are much harder to measure and, I’d suggest, have a much greater long-term impact on conversion.

To help illustrate my point let me give you three examples:

  1. The first example is changing a button label from “Download our software” to “Get started now”. I’m sure that in testing “Get started now” will be clicked on more often. But is downloading the software what users expect when they click the button, and how do they feel about what happens next?
  2. Another example is changing a newsletter subscription entry area from an in-page form to a pop-up dialogue (also known as a lightbox). Again, I’m sure the pop-up will get more conversions than the standard form on the page, but what does pushing it in someone’s face communicate about your brand and personality? I equate pop-ups like this to those pushy salespeople on the street who try to get your attention while you’re focused on something else.
  3. A final example is having a checkbox in a checkout/payment form that asks customers to check the box to NOT receive email newsletters. Again, this would no doubt get more subscribers than a straightforward question, but what impact does it have when an unwanted email arrives in the customer’s inbox?

Don’t get me wrong, I think A/B testing is extremely valuable and I use it whenever possible to help inform the designs I create. I just think it’s important to consider its findings in the context of other factors – specifically brand personality, trust and long-term business goals.

There is a wealth of dark-pattern UI tricks out there that convert better than what I’d call the “right” way of doing things, but clearly these won’t help in creating a valuable brand that customers want to tell their friends about. And let’s face it, word of mouth is crucial for online businesses.

As well as A/B testing, it’s important to do regular user testing. User testing will help build a picture around the more important question you want answered – why. Why are customers drawn to an element? Why are they more likely to click on it? What are they seeking?

Focus on creating real value for customers and on supporting them to do what they want to do in the order they want. Be timely and contextual in the delivery of your messages and offerings. Be true to your brand values and your conversions will follow naturally. A/B testing is a valuable tool in the toolbox but it’s not THE tool. I’m certainly looking forward to when the buzz surrounding it settles.


  1. Rob

    Very well said James.

    As you know, I think the world of A/B testing, but it’s critical that it’s done, as you say, AS WELL AS, not INSTEAD OF, user testing and “white hat” design.

    The way you measure success and analyse the data is also critical. Analysing site performance across a range of metrics and monitoring long term trends is crucial before you do any A/B testing.

    We’ve actually found the opposite of what you described about short-term success leading to longer-term pain. In a test that we recently ran, we found that the control (existing experience) initially performed better than any of the supposedly improved options we were testing. After a week, though, visitors had had an opportunity to get used to the variations and a different pattern started to emerge.

    Definitely important to remember that A/B testing can’t be used to replace good UCD practices but it can definitely be used to supplement them.

  2. Eleanor

    Good one James – well written and point nicely made!

  3. Leni

    Asking folk to forgo a concrete short-term gain in order to avoid a future uncertain and unquantifiable loss is a tough sell. Climate change is an example.

    Not to say that there’s no merit in the argument (there is), but it would stand up better were it supported by data. Is it possible to measure the loss of brand equity?

  4. James (Author)

    Thanks for your comment Leni.

    I’m not an expert in market research but it’s fair to say (as I do in my article) that it’s much harder to measure things like brand equity.

    I’d imagine a survey would be the best way to do it – to establish and monitor people’s perception of a brand. Even that is difficult, though, as it’s not until someone has interacted with a brand for a reasonable period of time that they can begin to accurately give their views on it.

    In terms of quantifiable examples, I’d look at Google’s “don’t be evil” approach and their #1 design principle, “Focus on the user and all else will follow.” The massive success they’ve had I’d put forward as a good quantifiable example.

  5. I agree.

    A/B testing deals with a specific interaction. It may not take into account the entire customer journey.

    I like your first point. It does seem that it can lead to a misinterpretation of the results. Just because they click that button doesn’t mean it’s their true intention.

    It’d be interesting to measure satisfaction after they click. That’d be a more complete picture.

  6. Yuval Ararat

    James, one thing came to mind while I was reading the post: service design.
    Service design is a holistic approach to the whole interaction with the brand, starting with the pitch and ending with the support tickets.
    A/B testing will remain a singular decision-making tool, but it will have to take user expectations into account and manage them accordingly.
    BTW, recently I find myself reading the checkboxes very thoroughly to make sure I get the desired response.