This is part nine of the Mobile Marketing Creatives Series. In ten episodes, we aim to provide insight and inspiration on creating thumb-stopping visuals to promote your app.
Download the Mobile Ad Creatives eBook today to read the other nine episodes in the series. The comprehensive guide includes ten core topics condensed into a practical blueprint with examples from AppAgent’s Creative Studio.
What you will learn in this episode:
– Why creative performance is tested on Facebook
– Which metrics to follow and what the hidden pitfalls are
– Whether the Facebook Ads dashboard is better than an MMP dashboard
Using Facebook to test mobile ads
For most publishers, both gaming and non-gaming, the main channel for creative testing is Facebook. It provides detailed information about the performance of individual ad creatives: you can see in which placements a creative performed well and how engaging it is. However, this data isn’t available for Automated App Ads (AAA) campaigns, or on iOS since the IDFA deprecation (more on that below).
In addition, Facebook has several placements that can provide insights for further creative development. For example, if a particular ad performed well in Reels, you should definitely use the same ad on TikTok. (If you haven’t seen it before, Facebook Reels is a TikTok copycat.) Placement testing is the first thing we do at AppAgent; our designers use the results to decide which creative format to focus on. To learn more, I highly recommend reading Facebook’s post on how people consume mobile video ads in different placements.
Google Ads, the second biggest user acquisition channel on mobile, doesn’t enable you to control and evaluate the performance of individual creatives and placements. And since insights from Facebook aren’t directly transferable to other platforms, Google Ads isn’t suitable for creative testing.
Companies with successful ads on TikTok should develop a specific creative strategy for that channel. However, the recent introduction of Facebook Reels may make this unnecessary if tests establish that creative performance matches across TikTok and Reels.
Another option large publishers use to test creatives is programmatic media buying, but this is a more limited case due to the technical complexity and larger budgets this type of user acquisition strategy requires.
How to test mobile user acquisition (UA) creative
We’ve summed up all our knowledge in an ebook on mobile ad creative testing, so for the purposes of this post we’ll only cover the fundamentals.
The creative testing process should first focus on identifying the best placement. That is achieved by developing a universal ad creative concept that can be adjusted for placements in Facebook Feed, Audience Network, Stories, and Reels.
The creative testing Key Performance Indicator (KPI) of every user acquisition manager is hidden behind the abbreviation IPM. It stands for installs per mille, or installs per one thousand impressions. (Mille is Latin for one thousand.) It measures how many installs each creative generates per one thousand impressions.
Why is IPM the best metric for creative evaluation? It combines the click-through rate (CTR) of an ad with the install rate (IR) in app stores. Well, not 100%, as Thomas Petit explains: “Looking at both CTR & IR is definitely insightful about how the messaging is perceived. But those numbers aren’t telling the whole story of creative performance. Be aware that with some of the installs coming through view-through attribution, a click may not be happening between an impression and an install: IPM is NOT exactly equal to CTR*IR!”
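To make the distinction concrete, here is a toy calculation with hypothetical numbers showing why IPM and CTR × IR diverge once view-through installs enter the mix:

```python
# Toy illustration (hypothetical numbers): IPM vs. CTR * IR.
impressions = 50_000
clicks = 1_200
installs_click_through = 90   # installs attributed to a click
installs_view_through = 25    # installs attributed without a click (view-through)

ctr = clicks / impressions                        # click-through rate
ir = installs_click_through / clicks              # install rate (store conversion)
ipm = (installs_click_through + installs_view_through) / impressions * 1_000

print(f"CTR: {ctr:.2%}")                            # 2.40%
print(f"IR:  {ir:.2%}")                             # 7.50%
print(f"CTR * IR * 1000: {ctr * ir * 1_000:.2f}")   # 1.80
print(f"IPM: {ipm:.2f}")                            # 2.30, higher than CTR * IR
```

The gap between 1.80 and 2.30 is exactly the view-through installs, which never pass through a click.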
In most cases, these two conversion metrics pull against each other: a high CTR often comes with a lower IR, and vice versa. Usually, the more attractive the ad and the more clicks it attracts, the less relevant the audience and the bigger the disconnect between the ad and the product presentation in the App Store or Play Store.
As a UA manager, you are constantly challenged to find a balance between the attractiveness of ads and high conversion in the store, as well as user retention after a download.
Speaking of retention, besides IPM there is another metric every UA manager should monitor: DX (day-X) retention. It indicates whether the quality of the traffic coming from a particular creative is good. However, deeper funnel metrics, including first payments, aren’t collected quickly enough to take immediate action during the first creative tests, unless you have a subscription app with a high free-to-trial ratio or an ecommerce app.
How to identify the best performing creative?
AppAgent’s preferred creative concept testing methodology involves putting one creative into each ad set and launching 4 to 5 ad sets (one new creative each) simultaneously.
This setup forces Facebook to test new creatives equally, without any “bias”. Numerous tests have shown that if a top-performing creative is mixed in with new creatives, it always beats the contenders (the new creatives). And if the contenders are all put in one ad set, the Facebook algorithm quickly starts preferring one of them, and not enough traffic is put behind the other creatives to reach statistically significant levels.
On the other hand, you can alternate that with another strategy used by Thomas Petit: “As soon as I get a winner among the new, I move it out and get data on the rest.”
The initial goal of creative testing is to hit 100 installs and 10K+ impressions per creative so you can evaluate the first data (meaning performance data for each new creative: impressions, CPM, CTR, install rate, and cost per install).
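The check described above can be sketched in a few lines. The 100-install / 10K-impression thresholds come from this post; the function names and sample numbers are illustrative assumptions, not a real API:

```python
# Hedged sketch: decide whether a creative test has enough data to evaluate,
# then compute the first-look metrics mentioned in the post.
MIN_INSTALLS = 100
MIN_IMPRESSIONS = 10_000

def ready_to_evaluate(installs: int, impressions: int) -> bool:
    """True once a creative has hit both evaluation thresholds."""
    return installs >= MIN_INSTALLS and impressions >= MIN_IMPRESSIONS

def first_look_metrics(spend: float, impressions: int, clicks: int, installs: int) -> dict:
    """IPM, CPM, CTR, install rate, and cost per install for one creative."""
    return {
        "ipm": installs / impressions * 1_000,
        "cpm": spend / impressions * 1_000,
        "ctr": clicks / impressions,
        "install_rate": installs / clicks if clicks else 0.0,
        "cpi": spend / installs if installs else float("inf"),
    }

creative = {"spend": 850.0, "impressions": 52_000, "clicks": 1_300, "installs": 120}
if ready_to_evaluate(creative["installs"], creative["impressions"]):
    print(first_look_metrics(**creative))
```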
This testing is usually done on a broad audience to see if Facebook is able to find the right segment.
Primarily, we are testing on one main market – usually the US – but geographic targeting is often app- or game-specific. For example, during the soft launch, priorities might be different and testing on the main market only begins at the later stages.
IDFA impact on mobile ad creatives testing
Privacy changes on the iOS platform have changed the game for user acquisition managers. There is much less data available these days, as Thomas Petit explains: “Whatever you see off the ad network – beyond a click – is not actual, but modeled data. Except for datapoints coming from Facebook, such as impression, click and 3-second view, everything else is extrapolated. That’s because Apple passes back only campaign-level data.”
Ads testing now happens primarily on Android, where the analysis at a creative level is still possible. Using creative performance results from Android for iOS campaigns is not ideal, but we’ve historically seen that the top-performing UA creatives achieved similar success on both platforms. However, this could be product specific.
To get closer to iOS user profiles, we filter the Android version to 10+ to exclude low-end devices. If your audience pool is large enough, you can adjust these filters (by adding older versions, whitelisting specific top devices, or blacklisting poor ones, for example) and show your ads to high-end Android device owners.
Another challenge of testing creatives on iOS is delayed reporting. Postbacks that inform UA managers about specific behavior, such as onboarding finished, starter pack bought, or level three reached, are all delayed, sometimes by up to three days, while the spend data coming from Facebook is nearly real-time. This makes any campaign and creative evaluation on iOS difficult, and it’s another reason to stick to creative testing on Android.
Creative performance evaluation pitfalls
There are some pitfalls to consider, especially if you are still exploring the best creative testing and evaluation setup for your company:
- Your ad account can negatively affect the performance of new creatives – it might sound odd, but we’ve proven that in some cases it’s better to create a new ad account to reset historical data if you haven’t been able to beat the control creative for a long time.
- Metrics are not comparable across placements – CTR, install rate, and IPM differ significantly between advertising placements. An ad in Facebook News Feed will have a completely different funnel to an ad shown in Stories, for example. Therefore, I suggest you evaluate creative performance segmented by placement. Once you’ve generated more data, you can evaluate using deeper funnel metrics.
- Creative testing using Cost Per Action (CPA) campaigns – even though we see great performance with cost-per-action campaigns, they’re too expensive for creative testing. Hitting the goal of 100 installs and 10K+ impressions is more efficient with campaigns optimizing towards installs, which also helps Facebook’s algorithm learn faster. A second round of checks is then required to ensure the new winning creative drives conversions, such as trials started and in-app purchases.
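The per-placement evaluation pitfall above is easy to demonstrate: the same pair of creatives can produce a different winner in each placement. A minimal sketch, with hypothetical rows and field names standing in for exported campaign data:

```python
# Illustrative sketch: compare creatives within each placement rather than
# across placements, since CTR, IR, and IPM baselines differ per placement.
from collections import defaultdict

rows = [
    {"creative": "A", "placement": "feed",    "impressions": 20_000, "installs": 52},
    {"creative": "B", "placement": "feed",    "impressions": 18_000, "installs": 39},
    {"creative": "A", "placement": "stories", "impressions": 15_000, "installs": 21},
    {"creative": "B", "placement": "stories", "impressions": 16_000, "installs": 35},
]

# Group IPM results by placement, then pick a winner within each group.
by_placement = defaultdict(list)
for r in rows:
    ipm = r["installs"] / r["impressions"] * 1_000
    by_placement[r["placement"]].append((r["creative"], round(ipm, 2)))

for placement, results in by_placement.items():
    winner = max(results, key=lambda t: t[1])
    print(placement, results, "-> winner:", winner[0])
```

With these toy numbers, creative A wins in Feed while creative B wins in Stories, which is exactly why a single blended IPM would be misleading.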
Is Facebook Ads manager better than MMP dashboard?
The first thing to do once you have the first campaign data is to check the consistency between Facebook reporting and your mobile measurement partner (attribution tool). Often you can spot discrepancies that need to be resolved, or at least understood.
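A consistency check like the one described above can be automated with a simple relative-difference rule. This is a hypothetical sketch: the 10% tolerance and the campaign numbers are illustrative assumptions, and real pipelines would pull both figures from the respective APIs:

```python
# Hedged sketch: flag install-count discrepancies between Facebook reporting
# and your MMP that exceed a chosen tolerance.
TOLERANCE = 0.10  # flag when the two sources differ by more than 10%

def discrepancy(fb_installs: int, mmp_installs: int) -> float:
    """Relative difference between the two reported install counts."""
    base = max(fb_installs, mmp_installs)
    return abs(fb_installs - mmp_installs) / base if base else 0.0

campaigns = {"campaign_1": (480, 430), "campaign_2": (120, 118)}
for name, (fb, mmp) in campaigns.items():
    d = discrepancy(fb, mmp)
    if d > TOLERANCE:
        print(f"{name}: {d:.1%} discrepancy -- investigate attribution settings")
```

Discrepancies above the tolerance usually trace back to attribution windows, view-through settings, or modeled iOS data, so understanding them matters more than eliminating them.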
Facebook reporting is generally very quick and allows you to define your own metrics, such as IPM (which is not a standard performance indicator). Facebook also offers more metrics, such as 3-second views per impression, ThruPlay, video average play time, and the percentage of a video watched (which your MMP doesn’t receive through the API).
To sum up, Facebook Ads Manager, an MMP dashboard, and your own dashboard each have their own pros and cons. You can only decide on the best solution for you, or your team, after exploring all three options. None of them is significantly better or worse than the others.
Creative testing is mostly performed on Facebook and Android, while learnings are then applied to the iOS platform.
Initially, your focus should be on testing a universal creative concept across all placements. Only after a dominant placement is identified should you adjust your creative strategy.
New creative tests have to be run in separate ad sets to force Facebook to drive enough impressions (10K+) and installs (100) for each variant.
Creative evaluation has to be done per placement, as each type is consumed differently by users and metrics may differ significantly.
The main metric is installs per mille (IPM), which combines ad click-through rate and store conversion.
Mobile Ad Creatives eBook
How to Design Ads and App Store Creatives
A comprehensive guide to designing thumb-stopping visuals that will grow your user base and revenue.
📕 Learn more about industry insights and best practices by signing up for our newsletter here.
🤝 Get help with growth strategy, app marketing, user acquisition and video ads production by reaching us at hi at appagent.com.
HOW TO READ MOBILE AD CREATIVE PERFORMANCE DATA?
When creative testing, focus your attention on IPM (installs per mille), an evaluation metric that combines the click-through rate (CTR) of an ad with the install rate (IR) in app stores.
Evaluating DX (day-X) retention can also help you understand whether the quality of the traffic coming from a particular creative is good.