A/B tests: Cut the fluff and spend the pixels on design elements of interest to users

Using alternative versions of your Web pages can improve engagement and increase e-commerce sales. Here's how to get started.


A/B testing is like many things about the Web that can be vexing: a simple concept can turn into a complex programming project. The idea is simple -- produce two (or more) versions of a web page and instrument them to see which one drives more traffic or more sales -- but getting it to work can be fraught with politics as well as implementation details.

Why bother? Mainly because almost nothing else you can do has such a big effect. Just by changing the text size or button color you can generate a 50% increase in clickthrough rates, which is what one of Denmark's airports found when it swapped website text that said "Shop Online" for "Buy Tax Free."

A/B testing isn't new: magazine publishers have long experimented with different cover layouts and images to determine what sells best on newsstands. But it may still be new to your team or management, and in fact many modern websites haven't learned what works and what users respond to. According to a new report by usability expert Jakob Nielsen, 52% of home page screen space is completely squandered, filled with blank space or with things users ignore: filler, self-promos and ads. Nielsen compared today's top business websites with those he examined more than a decade ago and found that most still give content and navigational elements short shrift. He recommends that designers "cut the fluff and spend the pixels on design elements of interest to users — mainly content, but also navigation."

We've put together some basic tips, an accompanying slideshow highlighting what has worked for major e-commerce retailers, and some suggestions on ways to attack your first testing project.
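At its core, the instrumentation is just two steps: assign each visitor to a variant, and record what they do. A minimal sketch of the assignment step (the function name and experiment label are illustrative, not from any particular testing platform) might hash the user ID so repeat visitors always see the same page:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "button-color") -> str:
    """Deterministically bucket a visitor into variant "A" or "B".

    Hashing the (experiment, user) pair means the split is stable across
    visits and independent across experiments -- a common approach, though
    real platforms add weighting, holdouts and exclusion rules.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

From there, each click or purchase is logged alongside the visitor's variant, and the two groups' conversion rates are compared once enough traffic has accumulated.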

First, find a champion

Your first challenge is all about politics: who owns the portion of your website that you are trying to improve? "Sometimes a chief marketing officer loses track of what their customers are buying, and doing A/B tests reconnects customers with the company," said Brooks Bell, head of a testing firm that bears her name. (I met her at a trade show earlier in November.)

But instead of picking the CMO or another executive, it might be better in the long term to find someone at the bottom of the food chain who is looking to shine. One of Bell's earliest projects was for such an individual, who didn't have any staff or much budget for testing. As a result of what they found and the bottom-line benefits they produced, "the manager now has a staff of ten and is training other groups within the corporation to do other testing, and was promoted to a director-level position that is now very influential and visible within the organization," she said.

You must be careful, however, that your design is far enough along that the testing can be useful. User experience expert Danielle Cooley says, "A/B testing is great when you're trying to determine if a specific detail should be one way or the other, such as should a call-to-action button be orange or green? But it doesn't help much early in the design process, when you still need to have a sensible workflow and solid page design."

Once you have found your testing champion, your next challenge is purely technical: what kind of testing platform are you going to use, and whether you build in-house or hire a consultant such as Bell, Wingify, Monetate or the dozens of others who ply this trade. You might want to designate a couple of your technical staff, give them a week and the necessary training to get up to speed on the topic, and let them find the right tool for their work. Keep in mind that you don't just want a coder: you want to build a longer-lasting team to conduct and interpret the tests.
"You need lots of talented people to be able to ask the right questions and interpret the insights from the data that you collect, [along with] skilled analysts and marketing people to fully realize your changes," said Bell. "We typically use a war room where we have all the stakeholders present and brainstorm on the best strategies."

If you do look outside, be aware that most of the consultants have pretty convincing stories about the amount of money they have saved clients over the years or the power of their approaches, so it can be hard to evaluate them.

Finally, you have to keep it going. Testing is a process, not a destination. Part of the challenge is being able to scale your testing effort up as word gets out around the company about your results. Another is that the best IT shops know they must make frequent changes to their sites to stay fresh and appeal to new eyeballs.

No one is better at this than Intuit, maker of QuickBooks. When visiting Intuit's offices, Matthew Heusser, Managing Principal at Excelon Development, saw how the company rolled out two interface designs and gathered statistics for one month before deciding which one to deploy. These tests have helped Intuit weather numerous changes over the years and keep its products fresh. "The amazing thing about Intuit is that it has already survived three major market shifts, from DOS to Windows, Windows to the Web, and now the conversion to social/mobile products," said Heusser. That is a lot of history.

Here are some final words of wisdom from Bell: "There is always something that an A/B test misses, even with the best of intentions and careful planning. So if you get stuck, think about redoing your tests with a different segmentation of your audience, or with a different focus. There is no single best testing option. Just because you can measure something in a test doesn't mean you should do it. Think more strategically and long-term, and make sure your executives ultimately support what tests you are doing."
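Gathering statistics for a month, as Intuit did, raises the practical question of when an observed difference is real rather than noise. One common answer (not specific to Intuit's process; the function name and the figures below are illustrative) is a two-proportion z-test on the conversion rates of the two variants:

```python
import math

def two_proportion_z(clicks_a: int, visits_a: int,
                     clicks_b: int, visits_b: int) -> float:
    """Return the z-score for the difference in clickthrough rates.

    A positive score means variant B converts better than variant A;
    |z| > 1.96 corresponds to roughly 95% confidence that the difference
    is not due to chance, assuming independent visits.
    """
    p_a = clicks_a / visits_a
    p_b = clicks_b / visits_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p = (clicks_a + clicks_b) / (visits_a + visits_b)
    se = math.sqrt(p * (1 - p) * (1 / visits_a + 1 / visits_b))
    return (p_b - p_a) / se
```

For example, 200 clicks from 1,000 visits against 260 clicks from 1,000 visits yields a z-score above 1.96, so the second variant's lift would count as statistically significant at the 95% level; identical rates yield a z-score of zero.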
