I have been toying with the design of my own website for some time now. It’s by no means a finished website, and it started out as a pet project more than anything else. Because of that I have allowed many of my own personal tastes to influence the design, but that has not kept me from thinking about how to optimize the page.
One obvious improvement would be to upgrade my website from a wix.com address to my own domain name. For the website design itself, I could ask friends and family for their feedback, but that approach would leave me vulnerable to a wide array of personal preferences and shoddy suggestions without ensuring that I was actually optimizing my website to suit my goals.
I have often thought about the possible impact the shape or color of a button, menu, link, or picture frame might have on those viewing the site. A/B Testing makes testing the impact of slight changes much easier to manage. The concept is simple. Take two things, in this case two webpages, that are almost exactly the same save for one variation (such as the color of a button) and see if there is a significant difference in users’ behavior (such as the number of people who clicked on it).
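To make "a significant difference in behavior" concrete, here is a minimal sketch of how the two variants could be compared with a standard two-proportion z-test. The function name and the click/visitor numbers are hypothetical, invented purely for illustration:

```python
import math

def ab_significance(clicks_a, visitors_a, clicks_b, visitors_b):
    """Two-proportion z-test: is variant B's click rate
    significantly different from variant A's?"""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled click rate under the null hypothesis (no real difference)
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical numbers: original button (A) vs. recolored button (B)
p_a, p_b, z = ab_significance(clicks_a=120, visitors_a=2400,
                              clicks_b=156, visitors_b=2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# A |z| above 1.96 means the difference is significant
# at the conventional 95% confidence level.
```

With enough visitors, even a small lift in click rate clears the significance bar; with too few, the same lift is indistinguishable from noise, which is why these tests need traffic.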
Optimizely offers its explanation of A/B Testing, and its services, to those with their own websites. Along with A/B Testing, Optimizely also provides the ability to run multivariate tests, though these require much more traffic to produce results and can become difficult to manage depending on how many variables are being changed. If you have the time and the traffic, multivariate testing can save you from testing variations one at a time only to find that the last one was the most effective for reaching your goal.
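The traffic requirement follows directly from how multivariate tests work: every combination of variations is its own page version, so the versions multiply. A small sketch, with made-up page elements, shows how quickly this grows:

```python
from itertools import product

# Hypothetical page elements and their candidate variations
variations = {
    "button_color": ["green", "red", "orange"],
    "headline":     ["short", "long"],
    "hero_image":   ["product", "people"],
}

# A full multivariate test shows every combination of variations,
# so each new element multiplies the number of page versions.
combos = list(product(*variations.values()))
print(len(combos))  # 3 * 2 * 2 = 12 versions to split traffic across
```

Twelve versions means each one sees only a twelfth of your visitors, which is why a site without heavy traffic may wait a long time for a statistically meaningful result.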
An effective A/B Testing goal should be simple. A goal like "increase revenue" works better than targeting a specific amount or percentage, because you can’t possibly know in advance what the impact of a change might be. If you think you know the impact a change will have before testing occurs, you may be making a mistake. A common mistake is believing all A/B test results will be the same or are directly transferable. Optimizely employee Grigoriy Kogan wrote an excellent blog post on the problems with A/B Testing Success Stories.
Many of the problems, and takeaways, of Grigoriy’s blog post can be applied to the success of President Obama’s reelection campaign. It’s an A/B testing success story regardless of your politics, and Obama isn’t afraid to tell you that.
When examining the success of Obama’s campaigns it’s important to take everything with a grain of salt. In a nice summation from BusinessWeek, Amelia Showalter, Obama’s director of digital analytics, was quoted as saying this:
“Every time something really ugly won, it would shock me: giant-size fonts for links, plain-text links vs. pretty ‘Donate’ buttons. Eventually we got to thinking, ‘How could we make things even less attractive?’ That’s how we arrived at the ugly yellow highlighting on the sections we wanted to draw people’s eye to.”
If someone were just to scan this article, they might think she is saying to make everything as ugly as possible to get the attention of your customers and users. Somehow I think most people would disagree with that assessment on instinct alone.
As the title of another feature from techPresident suggests, the key takeaways are to experiment and analyze. I would disagree with the title’s suggestion that you should analyze everything. I don’t believe even Amelia did that, especially since she only had an hour to run her tests and decide what to send out on a given day.
The success of one A/B test does not ensure that the outcome would be the same for another business. This is even more true when trying to transfer a campaign’s discoveries to a business. Businesses persist from year to year while a campaign is fleeting. Assuming customers and supporters recognize that fact, it makes sense that they might respond differently to the same strategy being implemented by each group.
So what about business success stories? HubSpot provided some good examples of A/B Testing successes on its blog, featuring EA, Upworthy, and comScore. I would encourage you to read about those successes while remembering to be critical of any of these stories. What works for one company may not work for everyone.
This assertion holds true for many reasons, one of them being the product a business is selling. I would not take the method I used to sell an EA video game and apply it unchanged to lingerie, or vice versa. Still, even lingerie sales can benefit from A/B Testing: while Victoria’s Secret might utilize A/B Testing, it was Adore Me that was featured in Fast Company for its testing practices.