AB testing is (mostly) a waste of time


19/06/2023 • In most B2B SaaS use cases, having a fully-fledged CRO program is (probably) a waste of time.


Now, I’m not saying all AB tests are a complete waste of time, but the practice of testing and tweaking everything to measure marginal differences over extended periods has gotten a bit out of control.

I’ve had colleagues test an orange button against a blue one for six weeks to squeeze out a 0.5% lift in click-through rate on a sample of 1M+ sessions, only to rerun the same test the next quarter and watch blue beat orange by over 1%. Instead of giving up, we spun up a new variant soon after, this time throwing pink into the mix.
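
For a sense of why a result like that flips, here’s a rough back-of-the-envelope sketch. The numbers are illustrative assumptions on my part - a ~4% baseline CTR, reading the “0.5%” as a relative lift, and an even 500k/500k split of those sessions - not figures from the actual test:

```python
from math import sqrt
from scipy.stats import norm

def two_prop_ztest(clicks_a, n_a, clicks_b, n_b):
    """Pooled two-proportion z-test: returns (z, two-sided p-value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, 2 * norm.sf(abs(z))

# Hypothetical arms: 4.00% vs. 4.02% CTR (a 0.5% relative lift),
# 500k sessions each - i.e. half of the 1M+ total mentioned above.
n = 500_000
z, p = two_prop_ztest(round(n * 0.0400), n, round(n * 0.0402), n)
print(f"z = {z:.2f}, p = {p:.2f}")  # roughly z = 0.5, p = 0.6 - pure noise

# Sessions per arm needed to detect that lift at alpha = 0.05, 80% power
p1, p2 = 0.0400, 0.0402
z_alpha, z_beta = norm.ppf(0.975), norm.ppf(0.80)
p_bar = (p1 + p2) / 2
n_needed = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
             + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
            / (p2 - p1) ** 2)
print(f"~{n_needed:,.0f} sessions per arm")  # on the order of 15 million
```

Under those assumptions the difference is indistinguishable from noise, and you’d need tens of millions of sessions per arm before a lift that small stops flip-flopping between quarters.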

In all honesty, I have little to no faith in these tests when they’re run just for the sake of testing. I’m sure there are “millions left on the table” at some eCommerce companies with 100M+ monthly users showing varying signs of buying intent and being put off by a red buy button - but for most B2B SaaS companies paying thousands out of their marketing budget to test H3 variations and CTA corner radii - why?

I’m not entering the whole CRO vs. segmentation vs. personalization vs. individualization conversation, because this isn’t about that, but I tend to agree with VWO’s concept of experience optimization: test the whole page or flow as a dramatically different variant rather than just a button - test one entire experience against another to get some real results.

