By Naoshi Yamauchi
Chief Performance Officer, Brooks Bell
In a perfect world, testing would be as simple as pushing a button and waiting for the results to roll in. But the world of testing is far from perfect—and getting an A/B test up and running can be a challenging, frustrating proposition. When planning a new testing program, it helps to have a sense of what you’re getting into. Here are eight hurdles to prepare for:
1. The Budget Barricade
Without a doubt, you will need a budget for a testing tool. And while some free or inexpensive options exist, you also need to account for the costs of creating test ideas, developing tests, and analyzing the data.
2. The Permission Palisade
Just because you and your team have realized testing is a great opportunity doesn’t mean everyone in your company is on board. Getting permission to test, especially on high-traffic pages and around conversion points, can take some serious convincing. If you’re struggling to vault the Permission Palisade, consider starting small and targeting elements that have a good chance of winning.
3. The Organizational Obstruction
Testing requires effort, and that means work: work for you, work for your team, and work for those supporting your team. When it comes time to ramp up a new testing program, make sure everyone knows the role they will play and what will be expected of them. Project managers can help with this.
4. The Technical Trench
When you want a new feature added to your site, do you get enthusiastic support? Or do you hear something about code freezes, timetables, and sprints? Just setting up a testing tool takes some work, and if you don’t have the IT resources available to make it happen, the whole endeavor may be impossible.
5. The Setup Snag
If your tool is implemented and all test assets are delivered, it’s time to launch a test. But no testing tool available can do this on its own. Someone has to create and manage the campaigns. Typically, this responsibility falls on an analyst or (as is often the case in smaller and younger programs) a marketer. If that person ends up being you, will you be ready?
6. The Creative Clog
The most obvious and effective optimizations are often creative. Changes to design, copy, and images require relatively little coding and can lead to some serious wins. But without the help of a designer, these variations can be very difficult to develop.
7. The Development Ditch
Once the tool is implemented, the development work does not end. Even user-friendly tools require some coding to run custom tests or variations with substantial page changes. If you are not comfortable with this kind of work yourself, ensure someone on your team or in your organization can handle it.
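To make that concrete, here is a minimal sketch of the kind of variation script a testing tool might run on top of an existing page. The element IDs, copy, and CSS class are hypothetical, not from any particular tool or site:

```typescript
// Minimal sketch of a client-side variation script (hypothetical element IDs, copy, and class name).
// Most testing tools inject code like this for visitors bucketed into the variation.
document.addEventListener('DOMContentLoaded', () => {
  // Swap the headline copy for the variation.
  const headline = document.querySelector<HTMLElement>('#hero-headline');
  if (headline) {
    headline.textContent = 'Start Your Free Trial Today';
  }

  // Restyle the call-to-action button; assumes this class already exists in the site's CSS.
  const cta = document.querySelector<HTMLElement>('#hero-cta');
  if (cta) {
    cta.classList.add('cta--prominent');
  }
});
```

Even a change this small needs someone who can write it, check it across browsers, and fix it when the underlying page changes.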
8. The Analytics Albatross
The test has launched! The data has been collected! What does it mean? Testing needs analysis to be meaningful. And teasing insights from test data is not trivial: like everything else on this list, it requires specialized knowledge and a certain degree of experience.
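To give a sense of what that analysis involves, here is a minimal sketch of one common first step: checking whether the gap in conversion rates between control and variation is statistically significant, using a two-proportion z-test. The visitor and conversion counts are made up for illustration:

```typescript
// Minimal sketch: two-proportion z-test for an A/B test result (all numbers are hypothetical).
function zScore(
  visitorsA: number, conversionsA: number,
  visitorsB: number, conversionsB: number,
): number {
  const rateA = conversionsA / visitorsA;
  const rateB = conversionsB / visitorsB;
  // Pooled conversion rate under the null hypothesis that A and B perform the same.
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const stdErr = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (rateB - rateA) / stdErr;
}

// Control: 500 conversions from 10,000 visitors. Variation: 560 conversions from 10,000 visitors.
const z = zScore(10000, 500, 10000, 560);
console.log(z.toFixed(2)); // about 1.89, below 1.96, so not yet significant at the 95% level
```

And a significance check is only the beginning; a thorough analysis also weighs sample size, segments, and whether the lift is worth acting on.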
Getting tests out the door, especially advanced tests, is no walk in the park. Make sure you’re ready to clear the common hurdles before starting.
About the Author
Naoshi Yamauchi is Chief Performance Officer at Brooks Bell. He joined Brooks Bell in 2009 and has been an advocate for data-based decision making, testing, and analysis ever since.
In this role, he leads testing strategy, guiding campaigns that span marketing, creative, and analytics teams. He also directs the optimization of internal testing methodologies and processes.
----------------------------------------
See Brooks Bell Live!
Brooks Bell, CEO of Brooks Bell, is speaking at Conversion Conference West 2014 in San Francisco, March 17-19. Check out her session, “5 Reasons to Say ‘No’ to a Test Idea.” Follow Brooks on Twitter and ask for a promo code to save on your pass.