Remember the scientific method? You form your hypothesis, test it out and then draw conclusions based on results. You may remember that fun science teacher who assigned experiments to determine which fertilizer would grow taller bean plants or if an object can survive a drop down the second-floor staircase (what was that supposed to prove again?).
It's been a while since most of us at Raka were in a science class, but we use a variation on the scientific method all the time. A/B testing, or split testing, is a way of measuring the effectiveness of your inbound marketing efforts: lead generation, web user experience, social media, advertising, and much more. By creating two landing pages, for example, that carry different messages or designs but target the same audience, you can see which one works better by comparing how many leads each converts.
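If it helps to picture the mechanics, here's a minimal sketch in Python of what a split test does under the hood. The traffic split and conversion numbers are made up for illustration; in practice your marketing platform handles this for you rather than a script like this.

```python
import random

# Each visitor is randomly shown one of two versions, and we count
# how many from each group convert (all numbers here are simulated).
results = {"A": {"visitors": 0, "conversions": 0},
           "B": {"visitors": 0, "conversions": 0}}

def assign_version():
    """Split traffic 50/50 between the control (A) and the variation (B)."""
    return random.choice(["A", "B"])

def record_visit(version, converted):
    results[version]["visitors"] += 1
    if converted:
        results[version]["conversions"] += 1

# Simulate some traffic with made-up conversion behavior, purely for illustration
for _ in range(1000):
    version = assign_version()
    converted = random.random() < (0.05 if version == "A" else 0.07)
    record_visit(version, converted)

for version, data in results.items():
    rate = data["conversions"] / data["visitors"] if data["visitors"] else 0
    print(f"Version {version}: {rate:.1%} conversion rate")
```

The point isn't the code itself; it's that every visitor has an equal chance of seeing either version, so any difference in conversion rate can be credited to the change you made rather than to who happened to show up.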
In this post, we’ll dive deeper into A/B testing, look at some A/B testing examples that worked, and talk about how you can make it work for your company.
All effective inbound marketing A/B testing experiments have this in common: choosing a "variable" and testing it against a "control." In a scientific experiment, the control is the baseline you test against. In A/B testing, the "control" is the way you've been doing things, such as your current home page, email campaign, or landing page design. The "variation" is a different version you're testing to see if it gets different results.
Say, for example, your company wants to improve the conversion rate of its email campaigns. People are opening the emails but aren't clicking through to buy. One A/B test you could run is changing the email's headline. Another might be changing the header image. Another still might be adding a banner with a coupon code. Or you could simplify the design. There are endless variables you could try.
When you start A/B testing, it helps to begin with one variable at a time to really home in on what’s working and what’s not. If you start your A/B testing with a full homepage redesign, you will never know if the difference maker was the image, design, new CTAs, or messaging.
A good example of this is the ad below from the gaming company EA, promoting the newest version of SimCity.
Control
The company had hoped to encourage pre-orders by offering $20 off a future purchase, but the promotion wasn't getting the results EA wanted. When the banner was removed (below), purchases surprisingly increased by 43.4%. The offer was distracting people from what they had come to do: purchase SimCity.
Variation
Once you've decided what you want to test, the next step is developing a hypothesis. If you remember from science class, a hypothesis is an educated guess, and to make one, you need to draw on what you already know about your digital marketing assets, company, and customers.
A good example of this is the wall decal company WallMonkeys. This online retailer wanted to boost conversions on its home page, so it began testing images and other tools. Here’s a screenshot of part of the home page, which has a CTA banner across an image featuring one of its murals.
After doing some research on the existing site, WallMonkeys learned the majority of users were clicking on the search bar at the top of the page, not the CTA. WallMonkeys switched out the image to draw the eye down the page, and conversions increased by 27%. Next, the company kept the same image but removed the CTA and added a branded search bar (see below). The result? A 550% increase in its conversion rate. Not too shabby!
WallMonkeys owes some of its success to its ability to decipher its own site data. By seeing that visitors were searching more than shopping, the company was able to develop a more informed hypothesis faster.
Variant
Many applications offer A/B testing tools as a product feature. MailChimp, Unbounce, Google Ads, and HubSpot are just a few of the platforms that make it easy for users to test messages, images, and more.
Let's take a look at setting up A/B testing for an email campaign in HubSpot. After creating one version of a marketing email, you'll create a second version to test how a single variation affects email opens or clicks. HubSpot sends each version to a different segment of your selected email list, and depending on how each performs, it can send the best-performing version to the rest of your list.
The first step is to set a goal, like more clicks. Next, develop a hypothesis based on what you know about your company and customers. If a discount code increased conversions for one target audience, maybe it will have the same effect on another. Once you've landed on a hypothesis, you'll have the variation you need when setting up the email campaign. The variation might be a different headline or subject line, a new header image, a simplified design, or an added incentive like a coupon code.
Once you have your control and variation emails created (A and B), HubSpot lets you configure your testing options (see below): how the test group is split between the two versions, which metric decides the winner, and how long the test runs before the winning version goes out to the rest of the list.
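If it helps to see those choices laid out, here's an illustrative stand-in in plain Python. These are not HubSpot's actual settings or field names, just the kinds of decisions any email A/B test asks you to make.

```python
# Illustrative only: the kinds of options an email A/B test asks you to set.
# These names are hypothetical, not HubSpot's actual API or field names.
ab_test_options = {
    "test_split": 0.50,          # share of the list included in the test, divided between A and B
    "winning_metric": "clicks",  # or "opens": the number that decides the winner
    "test_duration_hours": 4,    # how long to wait before sending the winner to everyone else
}
```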
Once the emails have been sent, HubSpot will give you results to analyze whether your hypothesis was correct. Here we can see version A did best, as well as how many people received it, how many opened it, and how many engaged with it.
While this example focuses on email marketing, the same considerations (like how long to run the test, how to split your audience, and which aspect to test) apply to other A/B tests.
Now comes the important part: analyzing the results. Whether it's a landing page to generate leads or an Instagram ad to boost sales, you'll need to gather and interpret the data. Take the example above. It's clear version A performed better. Was that the "control" email? If so, it looks like your hypothesis was incorrect. If that happens, don't look at it as a failure. You now know more about your audience and their preferences than you did before.
If version A was the "variant," your hypothesis was correct, and you can adjust your campaigns accordingly. You can also keep testing, making the variant email the control and coming up with another variable and hypothesis about how engagement can be improved. People's tastes change, and so will what they respond to, so running A/B tests from time to time is a good way to keep improving conversions.
Sometimes results surprise us. Who would have thought removing a discount offer from a webpage would INCREASE conversions? Sometimes results don't tell us much, either, like when there's only a 0.5-percentage-point difference between versions. When results don't point to a clear answer, refer back to your company's overall goals. Final goals like signing up and clicking through carry more weight than smaller ones like email opens or social media likes.
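One way to sanity-check a narrow gap is a quick significance test. Here's a minimal sketch using a standard two-proportion z-test; all the numbers are made up, and most marketing platforms run a calculation like this for you behind the scenes.

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the z-score and a two-sided p-value
    for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up numbers: a 0.5-percentage-point gap on 2,000 sends per version
z, p = z_test(conv_a=100, n_a=2000, conv_b=110, n_b=2000)
print(f"z = {z:.2f}, p = {p:.2f}")  # p is well above 0.05 here, so the gap could easily be noise
```

A high p-value doesn't mean the variation failed; it means the test couldn't tell the two versions apart, which is a cue to run the test longer, send to more people, or try a bolder change.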
Sometimes the hardest part of A/B testing is just getting started. As you’re planning, keep these tips in mind:
A/B testing offers companies endless possibilities for fine-tuning their marketing and messaging with very little risk. If you haven't experimented with A/B testing yet, you now know how to do it. Why not give it a try?