When you're making decisions about the overall content and look of your website, the perfect combination of
details rarely comes together overnight. One way clever marketers evaluate the effectiveness of their site's appearance is to use what is known as a "split test." Instead of simply guessing which color, font, image, or description makes the best impression, an analytics tool such as Google Analytics tracks your audience's reaction to subtle design changes. A split test provides valuable data on which version of your website sells the most products, generates the most downloads, or attracts the most attention. In short, it's a reliable way to determine whether a modification to your site will improve conversions for your business.
What is a split test, and how do I do it?
A split test can take the form of an A/B test, which compares two versions of your site against each other while varying a single element, or a multivariate test, which analyzes combinations of several elements at once. Once the split test is in motion, users are entirely unaware they are taking part in your field experiment. Here's a hypothetical example of how an A/B test works:
Set a Goal
Say your website features a bold “Subscribe” button at the bottom of the home page, but visitors aren’t signing up for emails at the rate you’d like. Your company decides to set a specific goal of garnering more readers by updating the appearance of the subscription button. You suspect that its poor placement is a factor.
Your webmaster redesigns the homepage by moving the "Subscribe" button to the top right-hand corner of the screen, hoping that visitors are more likely to spot it. Before committing to the change, you can have Google Analytics test both versions of your site.
Perform the Test
When customers reach your website, half are directed to the original version, "A," and the other half to the new version, "B." The software tracks users' behavior by recording who saw the "Subscribe" button, who clicked it, and who completed the sign-up process. Once enough visitors have taken part in the test, the resulting data provides statistical evidence of whether the change in button location influenced users' likelihood of subscribing.
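Analytics software handles the traffic split for you, but the underlying idea is simple to sketch. A minimal illustration in Python (the function name and the 50/50 split are assumptions for this example): hashing each visitor's ID, rather than flipping a coin on every page load, guarantees a returning visitor always sees the same version.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "subscribe-button") -> str:
    """Deterministically assign a visitor to variant A or B.

    Hashing the user ID together with the experiment name means the
    same visitor always lands in the same bucket for this test.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # a number from 0 to 99
    return "A" if bucket < 50 else "B"      # 50/50 split
```

Because the assignment is deterministic, a visitor who reloads the page or comes back tomorrow still sees the version they were originally shown, which keeps the two groups clean.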
Evaluate the Outcome
If the updated page resulted in 10% more subscribers, then your company's suspicion about the button's placement was correct. In effect, the A/B test confirms which design is more engaging, helping your business make the right decision. If the test shows an equal conversion rate for both versions, your company should go back to the drawing board and consider other possible reasons for the low subscription rate.
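The "after enough visitors" caveat matters: a small difference between A and B can be pure chance. Analytics tools run this check for you, but the standard calculation behind it is a two-proportion z-test, sketched below (the function name and the 1.96 cutoff, which corresponds to roughly 95% confidence, are illustrative assumptions):

```python
import math

def lift_is_significant(conv_a: int, n_a: int,
                        conv_b: int, n_b: int,
                        z_threshold: float = 1.96) -> bool:
    """Two-proportion z-test: is B's conversion rate significantly
    different from A's at roughly the 95% confidence level?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the assumption A and B perform equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return False
    z = (p_b - p_a) / se
    return abs(z) > z_threshold

# 100/1000 vs 150/1000 conversions: a real, significant lift
print(lift_is_significant(100, 1000, 150, 1000))   # True
# 10/100 vs 12/100: too few visitors to call it
print(lift_is_significant(10, 100, 12, 100))       # False
```

The same 2-percentage-point gap that is decisive at 1,000 visitors per version is noise at 100, which is why a test should run until the sample is large enough, not until the numbers look good.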
What elements of my site should I compare?
When conducting a split test, many webmasters will evaluate the following individual aspects of their site:
- The impact of wording, color, font, and size of major headings on the site
- Background patterns or colors and how appealing they are to users’ eyes
- The exact location of text or buttons on the page, including social media links and "Buy It Now" links
- Wording of your “call to action” copy
- The look of a logo
- The effectiveness of a product description
What do I do with results of the split test?
If you let the analytics software do its job, the data you receive over time can help you make important decisions about seemingly insignificant aspects of your website's design. With the A/B method, you can test individual elements one by one, making sure to evaluate the outcome of each test only once enough visitors have seen it. The possibilities are endless, and a smart online marketer should experiment with every aspect of their site before settling on the optimal visual appearance to enhance its selling power.