Tracking all these different variants in the Twitter Ads interface was always going to be a challenge, so to see which video got the most clicks we created two bit.ly links (one for each video) and tagged the final URL with Analytics UTM tracking parameters indicating whether the click came from Ciaran’s video or Louise’s video. Twitter’s ad reporting gave us the detail on the individual variations, while the two bit.ly links gave us an overall picture of which video was getting the most clicks. Bit.ly is great for measuring aggregate performance like this: every link you create has analytics attached, so we could simply compare how many clicks each link got to see how things were progressing.
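UTM tagging is just query parameters appended to the destination URL. Here is a minimal sketch of how the two tagged URLs might be built; the base URL and parameter values are hypothetical placeholders, not the campaign’s actual ones:

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign, content):
    """Append standard UTM parameters so Analytics can attribute each click."""
    params = {
        "utm_source": source,      # where the traffic came from, e.g. "twitter"
        "utm_medium": medium,      # channel type, e.g. "paid-social"
        "utm_campaign": campaign,  # campaign name
        "utm_content": content,    # which creative -- identifies the video
    }
    return base_url + "?" + urlencode(params)

# Hypothetical landing page and values -- one tagged URL per video,
# each then shortened into its own bit.ly link.
ciaran_url = tag_url("https://example.com/landing", "twitter", "paid-social",
                     "video-test", "ciaran-video")
louise_url = tag_url("https://example.com/landing", "twitter", "paid-social",
                     "video-test", "louise-video")
print(ciaran_url)
```

Because `utm_content` differs per video, Analytics can split the traffic by creative even though both links point at the same page.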
On Monday, the final bit.ly results looked like this.
Louise’s video was the clear winner by a long shot. It was aided by the fact that we turned many of Ciaran’s variants off when they failed to perform, freeing up budget for the combinations that were driving results. Ciaran was the one pulling the switches over the weekend, so you can be sure he gave them the best possible chance of working. He’s quite competitive like that, which makes the final outcome somewhat more amusing! You might question the sense in doing this, but our experience is that if you leave too many variants in a campaign with small daily budgets, the poor performance of some ads can drag the winning combinations down. Our advice is to kill off losing variations quickly if they stall, so you don’t lose momentum and stall the whole campaign.
At the end of this process we had tested 10 different combinations, spent under £50 and found a clear winning combination. Across all variants, Louise’s video performed better. Paired with either of the following messages, it gave us the magic greater-than-0.8% engagement rate we were looking for.
Failing to engage your digital leads? You need this.
Marketing Emails That Get An 85% Open Rate!? How’s That Done? Try this…
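Engagement rate here is engagements as a percentage of impressions. A quick sketch of the 0.8% check; the figures below are illustrative only, not the campaign’s actual numbers:

```python
def engagement_rate(engagements: int, impressions: int) -> float:
    """Engagement rate as a percentage of impressions."""
    return 100.0 * engagements / impressions

# Illustrative figures -- not the campaign's actual numbers.
rate = engagement_rate(9, 1000)
print(f"{rate:.2f}%")  # 0.90%
print(rate > 0.8)      # True -- clears the 0.8% bar
```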
The end result was as follows. We have highlighted the grouped pairs so you can see the performance results we got from each video / text combination.
As for the final video results: after the weekend experiment we let our winning combinations run for another 10 days or so. At the end of that period, the totals looked like this.
Louise’s video achieved 1638 clicks, while Ciaran’s video achieved… drum roll please… 91 clicks. That’s one video getting 18 times as many clicks as the other. Obviously the winning video was given far more budget and far more impressions, so we can’t actually claim it was 18 times better. It was, however, the stronger asset and worthy of the longer, more significant boost.
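The headline ratio comes straight from those two click totals; pure arithmetic, no assumptions:

```python
louise_clicks = 1638
ciaran_clicks = 91

# 1638 / 91 = 18.0 exactly: Louise's video got 18x the clicks.
ratio = louise_clicks / ciaran_clicks
print(ratio)  # 18.0
```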
As tests go, this whole thing wasn’t terribly scientific. It wasn’t meant to be. What we set out to do was see how different videos affected the outcome. What we learned was that it’s not as simple as that: the text messaging around the video still has a big effect. In every test, though, Louise’s video clearly outperformed Ciaran’s.
We hope this write-up has inspired you to start testing a few of your own creative combinations. If you do, we would love to hear how it went and what you learned. Don’t get stuck in the rut of pushing out a single variation before moving on. There is so much learning and fun to be had from running a few tests like this, and as you can see it doesn’t even cost much to do. It just takes a bit more time to create your assets, set up the tracking so you can monitor performance, and push the winning combos, or create variants of them, to really fine-tune your message. As you discover more winning phrases or CTAs, you could even try applying the learning to email subject line A/B tests.
It’s all about allowing yourself the creative freedom to experiment with variations and then allowing your audience to decide what good looks like. Every time you do this you will learn what works and get that little bit better.