Recently we have been experimenting with how to drive better engagement with our social media marketing. As well as testing multiple text variations on social platforms, we have also been looking at what difference the visual assets themselves make.
Video Vs Static Images In Social Posts
We have already identified that video, in general, outperforms static images in terms of the engagement it attracts. Fewer people are using video, and it moves, so it makes sense that it gets more attention. We wanted to know how much different videos could affect the performance of the same message, so we set out to test exactly that.
To test this out we opted to use Twitter Ads. Like many social media platforms, Twitter allows us to target an ideal audience and to test how much engagement we get from that audience with different variations of messaging and media. We are going to share with you the method we used and the figures and data our Twitter experiment generated. Just bear in mind that a similar approach could be taken on any social advertising platform.
Campaign structure for testing social visual assets
The structure of such an experiment is quite simple. Create a campaign on a social platform that targets a large enough audience likely to be interested in your content. Most platforms give you a rough indication of a suitable audience size. We chose a mixture of signals to target people interested in marketing who spoke English. As our objective was to drive up engagement with a recent podcast episode, we chose the campaign objective of Website Clicks/Traffic. We opened the geographic targeting of the campaign to a global audience to ensure we had a large enough pool of people to draw results from in the short time we were going to be running the campaign.
With this campaign running at a modest £12 a day, we created multiple tweets all designed to promote a single episode of the Digital Marketing Podcast.
Video Assets
We started off by creating two short video promos designed to grab attention on social media for the podcast episode. To make the videos we used Lumen5, which allows us to rapidly match a short script with a huge library of royalty-free videos. Our Marketing Director Ciaran created one video while showing Louise, our Marketing Manager, how to use Lumen5. Louise then attempted to create a better video using Lumen5 so we could put each video head to head with the same text messaging.
Here is Ciaran’s Video
And here is Louise’s
Our hunch was that Louise’s video was stronger. It has a more attention-grabbing opening clip and focuses more on human faces, which we know tends to work well for gaining engagement on social media. However, hunches can be wrong, so we wanted to test this one out and see how our targeted audience would react, and test some messaging while we were at it.
Why use Twitter Video Cards instead of uploading standard video?
We opted to focus our experiment on Twitter Video Cards, as multiple previous experiments have shown that we tend to get better engagement from video presented in a Twitter Video Card than from video uploaded directly into the tweet.
Just a few of the video cards we have been experimenting with.
Using the Promoted Tweets interface we initially crafted two different tweets to test against our two videos. Matching each tweet text variation with each video, we started this campaign with four different variants of our message. All of the promotions were paid promotion only, meaning they wouldn’t be posted to our organic Twitter feed. This is quite important when you are testing. You don’t want to bombard your followers with multiple variations of the same videos. Promoted-only tweets give you the freedom to test without annoying or upsetting your followers.
Fine Tuning Our Variants
So initially we created two text variants with Ciaran’s video and two with Louise’s. Each tweet variation was created as an ad within our campaign and left running. As the campaign focus was website traffic, our focus when measuring performance was the click-through rate. We needed enough impressions to show how well each message performed. It’s a rather arbitrary figure, but we aimed to get 1,000 or more impressions on each variant before reviewing its performance.
The challenge as the experiment commenced was finding a text and video combination actually capable of reaching our arbitrary target of 1,000 impressions in the short time we ran the test. If Twitter Ads doesn’t see strong levels of engagement, it slows down the rate at which your ad is shown. Within six hours our initial attempts at messaging were struggling to achieve this, as engagement with them was poor. Generally, we have found you want to be getting close to 0.8% to 1% engagement, or better, for Twitter’s algorithm to keep its interest in what you are promoting.
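The keep-or-pause decision we were making by hand can be sketched in a few lines: work out each variant's engagement (click-through) rate and flag anything below the roughly 0.8% mark where we've found Twitter's algorithm tends to lose interest. The variant names and numbers below are illustrative, not our real campaign data.

```python
# Engagement check per variant. The 0.8% threshold is the rough figure
# from our own experience; the clicks/impressions values are made up.
THRESHOLD = 0.008  # ~0.8% engagement

variants = [
    {"name": "ciaran-text-a", "clicks": 4, "impressions": 1100},
    {"name": "louise-text-a", "clicks": 12, "impressions": 1250},
]

for v in variants:
    rate = v["clicks"] / v["impressions"]
    status = "keep" if rate >= THRESHOLD else "pause"
    print(f"{v['name']}: {rate:.2%} -> {status}")
```

In practice we ran this check by eye in the Twitter Ads dashboard, but the logic is the same.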
Here were our initial text variants
Discover the secrets of successful personalised video that boosts Marketing performance. @target_insights talks to @bonjoroapp on how its done and how to ace it. #videomarketing
Want to Boost Your Customer Engagement? @target_insights talks to @bonjoroapp on how to use personal video for better outreach. Learn how it’s done and how to ace it. Listen here. #videomarketing
These two variants didn’t perform well and struggled to break the 0.5% engagement barrier. On reflection, they assume that the audience seeing them knows who @Target_Insights or @bonjoroApp are. Although we all like to imagine our brands are truly universal and omnipresent in everyone’s minds, the hard reality is that this isn’t the case. Our first variants were written the way we might write a tweet for our own followers. That just didn’t gel with our global target of marketers, and maybe it doesn’t really gel with our own followers either, if our recent engagement with our organic tweets is anything to go by!
“If at first you don’t succeed, then try try again.”
So we crafted some shorter, more targeted messaging to go with our videos and added two more variants into the mix. It’s a cliché, we know, but “If at first you don’t succeed, then try try again.” A lot of marketing is like this. Everything we do is our best guess, and it won’t always work even if it’s data-led. So you have to iterate and learn if you want results. That’s exactly what we did. The one thing we kept constant was that each new variant had a version for each of the videos we were testing. The following two messages were pushed into the mix.
Marketing Emails That Get An 85% Open Rate!? How’s That Done? Try this..
Personalised video done badly can seriously suck! but with the right tools…:-)
The first new variation was inspired by a blog post we saw on Bonjoro’s website highlighting some of the results their customers have reported back. Since the headline result seemed intriguing we went with it, mixing it with a phrase we have previously found works well for Twitter audiences: “Try this…” If you want to identify a few stock phrases to try out, check out our article on crafting effective headlines for a few ideas.
Rather ironically (and you might argue predictably!), the “seriously suck” message… seriously sucked. So much so that Twitter refused to even show it. Yes, it was so poor that we couldn’t even pay Twitter to show it to people. (This might have been down to the excessive punctuation, the negative sentiment, or both!) Seeing it wasn’t going anywhere, we paused it late on Friday and created a more cut-down, succinct version of the ‘85% open rate’ variation.
Failing to engage your digital leads? You need this.
Easier Tracking
Tracking all these different variants in the Twitter Ads interface was always going to be a bit of a challenge, so to help us know which video got the most clicks, we created two bit.ly links (one for each video) and marked the final URL up with suitable Analytics UTM tracking code indicating whether the link came from Ciaran’s video or Louise’s video. Twitter’s ad reporting would give us the detail on the different variations, but our two bit.ly links could give us an overall picture of which video was getting the most clicks. Using bit.ly to measure aggregate performance like this is great. Every bit.ly link you create has analytics attached, so we could very simply compare how many clicks each bit.ly link got to see how things were progressing.
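If you want to build the tagged links programmatically rather than by hand, the setup looks something like the sketch below: append the standard `utm_*` parameters to the landing-page URL, with `utm_content` distinguishing the two videos, then shorten each tagged URL with bit.ly. The URL and parameter values here are illustrative, not the actual links from our campaign.

```python
# Build a UTM-tagged landing-page URL for Google Analytics attribution.
# Only the utm_* parameter names are standard; the values are examples.
from urllib.parse import urlencode

def tagged_url(base_url, source, medium, campaign, content):
    params = {
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
        "utm_content": content,  # e.g. which video the click came from
    }
    return f"{base_url}?{urlencode(params)}"

# One tagged URL per video; each then gets its own bit.ly short link.
print(tagged_url("https://example.com/podcast-episode",
                 "twitter", "paid-social", "podcast-promo", "louise-video"))
```

The same tagging works on any platform, which is part of why UTM parameters are so handy for cross-checking an ad platform's own reporting.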
On Monday the final bit.ly results looked like this.
Louise’s video was the clear winner by a long shot. It was aided by the fact that we turned many of Ciaran’s variants off when they failed to perform, to free up budget for the combinations that were driving performance. Ciaran was the one pulling the switches over the weekend, so you can be sure he gave them the best possible chance of working; he’s quite competitive like that, which makes the final outcome somewhat more amusing! You might question the sense in doing this, but our experience is that if you leave too many variants in a campaign with a small daily budget, the poor performance of some ads can drag winning combinations down. Our advice is to kill off losing variations quickly if they stall so you don’t lose momentum and stall the whole campaign.
At the end of this process, we had tested 10 different combinations, spent under £50 and had a clear winning combination. Across all variants, Louise’s video performed better. Louise’s video paired with either of the following messages gave us the magic greater-than-0.8% engagement rate we were looking for.
Failing to engage your digital leads? You need this.
And
Marketing Emails That Get An 85% Open Rate!? How’s That Done? Try this..
The end result was as follows. We have highlighted the grouped pairs so you can see the performance results we got from each video / text combination.
Click on image above to see the detail
In terms of the final video results: following the weekend experiment, we let our winning combinations run for a further 10 days or so.
Louise’s video achieved 1,638 clicks vs Ciaran’s video, which achieved… drum roll please… 91 clicks. That’s one video getting around 17 times more clicks than the other. Now, obviously the winning video was given a lot more budget and a lot more impressions, so we couldn’t actually claim it was 17 times better. However, it was the stronger asset, so it was worthy of a longer, more significant boost.
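To see why raw click totals can't be compared directly, it helps to normalise by impressions, which is what click-through rate does. The article only reports the final click counts, so the impression figures below are purely hypothetical, chosen just to illustrate the point that a video with far more clicks need not be 17 times better on a per-impression basis.

```python
# Click-through rate normalises clicks by how often each ad was shown.
# The click totals are from our campaign; the impression counts are
# hypothetical, since the winning video received far more impressions.
def ctr(clicks, impressions):
    return clicks / impressions

louise = ctr(1638, 150_000)  # assumed impressions
ciaran = ctr(91, 12_000)     # assumed impressions

print(f"Louise: {louise:.2%}, Ciaran: {ciaran:.2%}")
```

Under these assumed figures the gap in rate is far narrower than the gap in clicks, which is exactly why we wouldn't claim one video was 17 times better.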
Conclusions
As tests go, this whole thing wasn’t terribly scientific. It wasn’t meant to be. What we set out to do was see how different videos affected the outcome. What we learned was that it’s not as simple as that. The text messaging around the video still has a big effect. However, in every test, Louise’s video clearly outperformed Ciaran’s.
We hope this write-up has inspired you to start testing a few of your own creative combinations. If you do, we would love to hear how it went and what you learned. Don’t get stuck in the rut of pushing out just one variation before you move on. There is so much learning and fun to be had by running a few tests like this, and as you can see it doesn’t even cost much to do. It just takes a bit more time to create your assets, set up the tracking so you can monitor performance, and push the winning combos (or create variants of them) to really fine-tune your message. As you discover more winning phrases or CTAs, you could even try applying the learning to A/B tests of email subject lines.
It’s all about allowing yourself the creative freedom to experiment with variations and then allowing your audience to decide what good looks like. Every time you do this you will learn what works and get that little bit better.