This may or may not come as a surprise, but I am never satisfied with things if they are static. I feel like most of you out there can agree with that sentiment.
Recently, I wanted to shake up our website traffic and see if I could generate more conversions. I designed and ran an A/B test. I let it run for 30 days. I would check every few days to see who was winning, A or B. You know what happened at the end of 30 days?
The test ran, and it ran correctly. But the results were inconclusive. Both options, A and B, were equally engaging (or not engaging) to website visitors. Which means I don't have the answer I was looking for. Where do you go from here? What happens when your A/B test doesn't give you the answer?
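If you want to check for yourself whether a result really is inconclusive (rather than eyeballing the conversion numbers), a standard approach is a two-proportion z-test. Here's a minimal sketch in Python, using made-up visitor and conversion counts for illustration; your testing tool's numbers would go in their place.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare two conversion rates.

    conv_a / conv_b: conversion counts for variants A and B
    n_a / n_b: visitor counts for variants A and B
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical month of traffic with nearly identical conversion rates
z, p = two_proportion_z_test(conv_a=120, n_a=5000, conv_b=118, n_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With numbers this close, the p-value lands far above the usual 0.05 cutoff, which is exactly what "inconclusive" means in practice: you can't rule out that the difference is just noise.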
The good news is you always have options.
- Re-run the test
- Run a new test
- Run a test that isn’t A/B testing
- Leave things as they are
Let’s unpack these, shall we?
Re-run the test
You can re-run the test you just ran to see if you get different results. Why should you consider this? You may have a seasonality to your site (or ads, or email) that wasn't factored in. For example, I let my A/B test run for the month of July. Historically, that's one of our slower months. Had I realized that earlier in the year, I probably would have slated a different time frame. Take a look at your data before deciding the timing of an A/B test. That might make a difference.
Run a new test
This one seems pretty obvious. If the first test was inconclusive, run a new test. You can design something completely different from the first one, or keep the same control and test a new variation. For example, I was testing part of our home page to see if changing the content would result in more conversions. When I design a new test, I'll use that same section of the home page but with different content, with the same goal of more conversions.
Run a test that isn’t A/B testing
This is more along the lines of market research, so not really a test at all. Instead of a programmatic A/B test using software like Google Optimize, ask your customers and community what they want. These are people you already know and who know you, versus anonymous people visiting your site. You can show them the existing asset and then give them different options. Ok, it’s an A/B test. I almost had you though.
Leave things as they are
You know what? You don’t have to change anything. If you run an A/B test and the results are inconclusive, you don’t necessarily need to start making other changes. Maybe it was the wrong test and the wrong time. Give it a minute. Breathe a little. Go back to your data and see if there is something different you should be looking at. Maybe it’s not the asset, but the audience or the offer that’s wrong.
At the end of the day, more testing is always a good idea. Test early, test often. Behaviors change, platforms change, and solutions change.
Are you A/B testing your marketing? Tell me about it in our free Slack group, Analytics for Marketers.
Need help with your marketing data and analytics?
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new 10-minute or less episodes every week.