A/B testing: A peek behind our marketing curtain

Jeff Alford, SAS Insights editor

Are writing and design art or science? For marketing writers, editors and designers at SAS, the answer is: both. The art of writing is creativity, tone, voice, style, word choice – the intangibles that distinguish one writer from another. The scientific, technical parts of what we do tend to be the “rules” of grammar, good writing and design, which we continually debate among ourselves (for example, is “cancelled” a misspelling or an alternate spelling?).

Whether we admit it or not, sometimes (maybe often) we wonder if we’re making the best editing and design choices. This is especially true on the web. Will moving this piece to this location make more readers stay on the page longer? Will using this phrase versus another attract more search traffic? Sure, there are tools that can help figure these things out. But in the long run, how do you “know” you’re making the best choice? Think of it like a jury trial: sometimes the jury gets it right and sometimes it doesn’t, but you almost never know for sure.

But there is a way to know with greater accuracy: A/B testing.

Simply put, A/B testing compares two options in an empirical way to determine which is more effective. The goal is an improved customer experience.
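As a rough sketch of the mechanics (not SAS’s actual tooling), the first step is splitting visitors between the two options. One common approach is to hash a visitor ID so each person is assigned once and consistently sees the same variant on every visit; the function and IDs below are hypothetical:

```python
import hashlib

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the user ID (rather than picking randomly on each
    page load) keeps the assignment stable, so a returning
    visitor always sees the same version of the page.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Over many visitors, the hash spreads traffic roughly evenly between the two variants, which is what makes the comparison fair.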

Applying science to writing and design

Here at SAS, we make extensive use of A/B testing for a wide range of activities, from GUI development to designing collateral templates.

At the Insights editorial team meeting a few months ago, I casually mentioned that I thought I had found a way to create a more prominent call to action in our articles.

We had been using links in the right-hand sidebar under a “read more” heading to prompt our readers to take the next step in finding out more about a topic or a SAS® solution.

That worked, but we believed that if we moved that call to action to a more visible spot we would get a better response. We experimented with a few ideas, but weren’t entirely satisfied.

Sometimes it takes a polarizing choice to move ideas along and that’s what happened in this case. I showed the team what I’ll refer to as the “big orange button” (BOB). And, it was exactly that. Big, bright, unattractive and stuck right in your face like a thumb in your eye. You could not ignore it, but would people click it?

Here’s what the button looks like. This is the call to action for this article. It will take you to a white paper that discusses A/B testing in more detail and offers ideas on how you can make better use of A/B testing. Finish reading this article first (please), then click the button (go ahead, you know you want to).

A/B testing white paper

The team immediately fell into two camps – those who hated it and those who didn’t hate it quite as much. Fortunately, our director suggested that we perform an A/B test. We worked with SAS’ crack A/B web experts to develop the test plan. They chose the article where I first used the button as the test page. In A/B speak, the traditional right-hand sidebar location was the “champion” (A version) and the BOB was the “challenger” (B version).

Champion vs. challenger: The results

No matter where readers were coming from (via search, social media, the Insights index page or the Insights newsletter), half of them saw the champion (A) and the other half saw the challenger (B). The test was designed to measure how often readers clicked on the call to action in either A or B.
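A tally like the one below is one way to picture what that measurement collects: per-variant impressions and clicks, from which the click rates are computed. The event log here is a made-up stand-in for real tracking data:

```python
from collections import Counter

# Hypothetical event log: (variant, clicked) pairs recorded during the test.
events = [("A", False), ("A", True), ("B", True),
          ("B", True), ("A", False), ("B", False)]

impressions = Counter(variant for variant, _ in events)
clicks = Counter(variant for variant, clicked in events if clicked)

for variant in ("A", "B"):
    rate = clicks[variant] / impressions[variant]
    print(f"Variant {variant}: {clicks[variant]}/{impressions[variant]} = {rate:.1%}")
```

The real test ran the same comparison at scale, with thousands of impressions instead of a six-row toy log.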

The test continued until predetermined thresholds for page views, call-to-action click-throughs and white paper downloads (or conversions, in marketing speak) were reached. In our case, the test lasted roughly six weeks. The tester then prepared a report that included the test methodology, screenshots of the two variants, the sample size and the results.

Here are the findings:

Version B (with the button in the body) produced a 193 percent increase in conversions, along with a 213 percent increase in clicks and an 89 percent increase in overall page engagement. The test reached a 99 percent confidence level.

Key metrics (A vs. B):

  1. Conversions: 5.46% vs. 15.99%
  2. Clicks: 8.53% vs. 26.74%
  3. Engagement: 23.55% vs. 44.48%
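To see where figures like the 193 percent lift and the 99 percent confidence level come from, here is a minimal sketch using the reported conversion rates and a standard two-proportion z-test. The per-variant sample size of 1,000 is an assumption for illustration; the article does not report the raw counts:

```python
from math import sqrt

def two_proportion_z(p_a, n_a, p_b, n_b):
    """z statistic for the difference between two conversion rates."""
    pooled = (p_a * n_a + p_b * n_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Conversion rates from the test; n = 1,000 per variant is assumed.
p_a, p_b, n = 0.0546, 0.1599, 1000

lift = (p_b - p_a) / p_a          # relative improvement of B over A
z = two_proportion_z(p_a, n, p_b, n)

print(f"Lift: {lift:.0%}")        # ~193%, matching the reported increase
print(f"z = {z:.2f}")             # well beyond 2.576, the 99% two-sided cutoff
```

At any realistic sample size, a jump from about 5.5 percent to about 16 percent clears the 99 percent bar easily, which is why the result was declared decisive.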


The BOB was the decisive winner.

The challenger unseated the champion in this case, but does that mean it’s the best of all possible options? No, only further testing with new challengers will determine that. But for now, we can say with some certainty that the BOB works better than what we were doing before. And as we learn, we test repeatedly under different scenarios to make sure we implement only the best practices.

Be forewarned: sometimes the champion will stay on top, and sometimes there may be no clear winner. What’s important is designing your test well so that other factors don’t muddy your results. Going back to the court analogy I used earlier, that would be the equivalent of grounds for a mistrial.

