What I’ve learned over ten years of A/B testing

There are a ton of tools that enable A/B and multivariate testing these days: Optimost, Optimizely, SiteSpect, Monetate, Adobe Digital Experience Manager, Web Trends, Google Experiments (formerly Google Website Optimizer) and more. Generally speaking, they fall into three classes:

  • Client-side testing tools
    • Optimizely, Monetate and others fall into this category.
  • Server-side testing tools
    • SiteSpect is the only purely server-side testing tool I’m aware of.
  • Hybrid tools that can be used client- and/or server-side
    • Web Trends, Google Experiments

The first time I heard of A/B testing was while working at Quinstreet. Being new to digital marketing as they approached it, beyond just SEO and paid search, I learned a ton there back in 2002 – 2003. Being the first internal marketing manager on paid and free search was a blast. However, I ended up designing a lot of the websites I promoted myself, which was no fun (aside from the obvious – I love good design, but producing it isn’t in my skill set).

So after a while, I wanted the sites to get redone. Traffic was flowing and the leads were pouring in. However, because Quinstreet had a testing-oriented culture (very, very rare back in those days), they wanted to validate before pouring design resources into something that might have a negative return. That was a roadblock, as my sites were all hosted externally and the servers were not capable of running Java. They were straight-laced Apache web servers, without an extra application server layer to be found.

What was a novice internet marketer to do, when he knew his conversion rates were lower than they should be due to the design and experience quality of the site? Testing to the rescue. I partnered with the engineering team and learned the basic mechanics of A/B testing. After about a week, I had a Perl script (yes, I know – remember, PHP wasn’t so popular back then) running which would, depending on a cookie, serve either the “A” or “B” version of the site. After getting everything up and running, we started the test…guess what the results were?
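The mechanics that script implemented are simple enough to sketch in a few lines. This is a minimal illustration in Python, not the original Perl, and the cookie name and 50/50 split are my assumptions:

```python
import random

VARIANTS = ("A", "B")
COOKIE_NAME = "ab_variant"  # hypothetical cookie name

def assign_variant(cookies: dict) -> tuple[str, bool]:
    """Return (variant, is_new_assignment).

    If the visitor already carries the cookie, reuse its value so
    they see a consistent experience across page views; otherwise
    pick a variant at random (an even 50/50 split).
    """
    existing = cookies.get(COOKIE_NAME)
    if existing in VARIANTS:
        return existing, False
    return random.choice(VARIANTS), True

# A returning visitor keeps their original bucket:
print(assign_variant({"ab_variant": "B"}))  # → ('B', False)
```

The server then renders whichever template matches the variant and, for new assignments, sets the cookie in the response so the bucket sticks for the rest of the test.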

A 30% higher conversion rate. Same traffic sources, slightly lower CTR on the primary call to action, but a massive win in conversion rate. The call to action was a link to an affiliate lead capture form, so the experience of jumping from one site to the next had a big impact on conversion. When you have a high-quality experience on a site and then move to another site of similar quality, things make sense. But if you jump from a bad site to a good site, you notice the difference and it’s jarring.

Interestingly, when I arrived at Yahoo in 2004, the “launch, learn and tune” culture was still being rolled out across the company. They had two testing platforms while I was there (I’m guessing they have one now) and many properties had yet to fully embrace testing to improve conversion rates.

So what did I learn from all this testing? A few highlights:

  • User experience, even without a single word of copy changed, can have a massive impact on conversion
  • Sometimes when things are important, you have to roll up your sleeves to get the job done
  • Test what people notice – ignore things that people won’t pay attention to

Years later, I was involved with testing at another company. We ran several distinct tests, each one to validate that footer changes wouldn’t have an impact on conversion. The results? Go nuts – do whatever you want in the footer. A prospect who’s in “buying mode” won’t scroll to the bottom of the page and investigate their options. Only somebody in desperate need of help, a journalist, or somebody out shopping the competition will do that.

I have a lot more testing experience from my social network work, my business consulting and, of course, Intuit. Their culture was all about A/B/C and multivariate testing, which was fantastic. Some of the insights the team put together that drove big wins were very clever and had nothing to do with design – only prospect insights. Small changes in verbiage, based on a thorough understanding of the problem, can sometimes pay off in a huge way.