Creative Testing in Launching and Scaling Mobile Apps

⛏️ Guest Miner: Sylvain Gauchet
Rating: ★★★★★
💎 x 14

Lucia Mrvova (Head of UA at AppAgent) talks in detail about creative testing: testing concepts vs. iterations, the process and cadence you need, the best way to structure your tests, and more.

Source: Creative Testing in Launching and Scaling Mobile Apps
Type: Webinar
Publication date: April 7, 2021
Added to the Vault on: April 28, 2021
These insights were shared through the free Growth Gems newsletter.
💎 #1 [36:20]
With automation taking over, the two biggest levers a UA manager can pull today are optimization events and creatives.

💎 #2 [38:00]
Even for a single concept, changing one variable (e.g. the character shown) can result in very different performance (CTR, CVR, etc.). Always look at the full funnel.

💎 #3 [42:10]
If you test on Facebook, isolate creative testing to a single placement (e.g. the Facebook feed) so that all ads are measured in a consistent environment.

💎 #4 [43:30]
Appoint a main person responsible in each team, one from the UA team and one from the creative team, to own the testing process together (see process below).

💎 #5 [44:45]
Focus on proving a concept first, before moving into extensive iterations. But do test 2-3 significant iterations to avoid missing a potential winner because of one “wrong” element.

💎 #6 [45:27]
Think in terms of big changes when working on iterations (e.g. real vs. animated footage, different characters, B&W vs. color).

💎 #7 [45:00]
If you’re spending anywhere between $100k and $500k/month, a good starting cadence is 5 concepts/week with about 2-3 major iterations each. What matters most, however, is keeping a consistent cadence, even at a lower frequency.

💎 #8 [46:44]
It’s important to identify the key variables that drive performance. Knowing them also lets you use automation to generate many versions by varying those key variables.

💎 #9 [48:23]
If optimizing for app installs, around 300 installs per creative tested should get you to 90% statistical significance. However, if you have a lot of impressions that lead to no installs at all, use common sense and stop the test early.

💎 #10 [49:45]
Build multivariate tests manually, forcing equal delivery through separate ad sets. This avoids wasting budget with Facebook’s built-in A/B test.

💎 #11 [50:55]
To save budget, use 2-layered testing: retest the winners from your MAI (mobile app install) optimized campaigns on AEO/VO (app event/value optimization) for ROAS.

💎 #12 [59:00]
Be careful: rejections of your creatives are “remembered” at the account level and can then hurt your standing in auctions.

💎 #13 [1:23:38]
As soon as you have a creative concept that is good enough to be seen, launch it. Don’t over-polish before giving it a try.

💎 #14 [1:26:50]
Recycle creatives as much as you can (e.g. by running TikTok winners on Facebook or Google). It won’t always work, but you won’t know until you try. Just resize them to fit each platform’s format.

Gems are the key bite-size insights "mined" from a specific mobile marketing resource, like a webinar, a panel or a podcast.
They let you save time by grasping the most important information in a couple of minutes, and each one includes the timestamp from the source.


The importance of creative testing

[💎@36:20] With automation taking over, the two biggest levers a UA manager can pull today are optimization events and creatives.


[💎@38:00] Even for a single concept, changing one variable (e.g. the character shown) can result in very different performance (CTR, CVR, etc.). Always look at the full funnel.

UA managers have to deal with the “UA paradox”: the creative with the best CTR often has the worst CVR, while the one with the worst CTR has the best CVR.


Where to test?

[💎@42:10] If you test on Facebook, isolate creative testing to a single placement (e.g. the Facebook feed) so that all ads are measured in a consistent environment.


How to test?

[💎@43:30] Appoint a main person responsible in each team, one from the UA team and one from the creative team, to own the testing process together (see process below).

[💎@44:45] Focus on proving a concept first, before moving into extensive iterations. But do test 2-3 significant iterations to avoid missing a potential winner because of one “wrong” element.

[💎@45:27] Think in terms of big changes when working on iterations (e.g. real vs. animated footage, different characters, B&W vs. color).

[💎@45:00] If you’re spending anywhere between $100k and $500k/month, a good starting cadence is 5 concepts/week with about 2-3 major iterations each. What matters most, however, is keeping a consistent cadence, even at a lower frequency.


[💎@46:44] It’s important to identify the key variables that drive performance. Knowing them also lets you use automation to generate many versions by varying those key variables.

[💎@48:23] If optimizing for app installs, around 300 installs per creative tested should get you to 90% statistical significance. However, if you have a lot of impressions that lead to no installs at all, use common sense and stop the test early.
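The “300 installs for ~90% significance” figure is a rule of thumb from the talk. A minimal sketch of how you could check significance yourself, using a standard two-proportion z-test on install rates (the install and impression numbers below are made up for illustration):

```python
# Two-proportion z-test: is creative A's install rate significantly
# different from creative B's? Pure stdlib, no external dependencies.
import math

def z_test_two_proportions(installs_a, impressions_a, installs_b, impressions_b):
    """Return the z-score and two-sided p-value for the difference in install rate."""
    p_a = installs_a / impressions_a
    p_b = installs_b / impressions_b
    # Pooled install rate under the null hypothesis (no real difference)
    p_pool = (installs_a + installs_b) / (impressions_a + impressions_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: A got 300 installs from 60k impressions, B got 240.
z, p = z_test_two_proportions(300, 60_000, 240, 60_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # significant at the 90% level if p < 0.10
```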

[💎@49:45] Build multivariate tests manually, forcing equal delivery through separate ad sets. This avoids wasting budget with Facebook’s built-in A/B test.
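One way to picture the manual setup: one ad set per creative variant, all with identical budgets, so spend cannot be skewed toward an early favorite. The campaign structure below is a hypothetical sketch, not the real Facebook Marketing API:

```python
# Lay out a manual multivariate creative test: one ad-set spec per
# combination of variable values, each with the same daily budget.
from itertools import product

def build_test_ad_sets(concept, variables, daily_budget_per_ad_set):
    """Return one ad-set spec per combination of variable values."""
    names = list(variables)
    ad_sets = []
    for combo in product(*variables.values()):
        variant = dict(zip(names, combo))
        label = "-".join(f"{k}:{v}" for k, v in variant.items())
        ad_sets.append({
            "name": f"{concept} | {label}",
            "daily_budget": daily_budget_per_ad_set,  # equal budgets force equal delivery
            "creative_variant": variant,
        })
    return ad_sets

# Hypothetical test: one concept, two variables with two values each.
ad_sets = build_test_ad_sets(
    "summer-update",
    {"character": ["hero", "villain"], "style": ["real", "animated"]},
    daily_budget_per_ad_set=50,
)
print(len(ad_sets))  # 4 ad sets, one per variant, all with the same budget
```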

[💎@50:55] To save budget, use 2-layered testing: retest the winners from your MAI (mobile app install) optimized campaigns on AEO/VO (app event/value optimization) for ROAS.



How not to test?

[💎@59:00] Be careful: rejections of your creatives are “remembered” at the account level and can then hurt your standing in auctions.


Collaboration between UA and Creative teams

Q&A

A somewhat lower CTR can be a good signal if in-app conversion is good: it means the creative “filters” for qualified traffic.


How often do you recommend basing your ideas on players’ profiles and motivations?

For both games and apps it’s important to nail down your target personas, because different things appeal to each of these segments. But don’t hesitate to think outside the box either.


How to start with TikTok? Can it be viable for small indie companies?

Marijana (Pexels) - For indie companies it’s best to start with Google and Facebook. But you’ll never know until you try.
Snezana (Pexels) - The TikTok platform is very specific, and sometimes the recommendations or best practices do not work.


Experience with misleading creatives (fake ads)?

Snezana - they haven’t done any fake ads because they don’t want to deceive users, and they haven’t needed that kind of volume.

Lucia - you can simplify the UI in the ad, but she hasn’t found value in showing something completely unrelated: users might install the app but will then leave.


Duration of creative testing?

Lucia - run the test for at least a full week to account for seasonality, but sometimes performance is so bad or so good that you can tell earlier. Reaching statistical significance also depends on your budget and what you’re optimizing for.

Snezana - on Facebook they run with a lowest-cost-per-install (no cap) strategy, so you get results faster.


Have you tested videos with and without sound?

Snezana - on TikTok, users don’t mute the sound, so it is super important. On other platforms they try not to rely too much on sound, because videos are muted by default.

Lucia - add subtitles/captions for Facebook.


How to get insights about the audience before you start testing?

Lucia - you usually have some assumptions about who the game is for and what they like. Then you can run focus groups and talk to people to understand better. There are also marketing intelligence tools and in-app surveys.

Snezana - they create user personas for their new games. Once the game is out and live they rely a lot on user feedback: user reviews on the Google Play Store and the App Store help you understand what players like and don’t like.


Is it worth spending more time on the creative/graphic process to polish things, or should you test as soon as possible?

Snezana - go as soon as possible.

[💎@1:23:38] As soon as you have a creative concept that is good enough to be seen, launch it. Don’t over-polish before giving it a try.


What are the reasons for a long learning phase for some creatives and how to avoid it?

Marijana - when you test a new concept, don’t limit the algorithm: use lowest cost per install and an appropriate budget.

Snezana - on UAC, if an ad group works really well, move your new creative into that group (even if it’s not the same theme) to shorten the learning phase.


Translating winners from TikTok to other platforms?

Snezana

[💎@1:26:50] Recycle creatives as much as you can (e.g. by running TikTok winners on Facebook or Google). It won’t always work, but you won’t know until you try. Just resize them to fit each platform’s format.

Lucia - she tends to test mostly on Facebook and then apply the winners on Google and Snapchat.


Testing creatives with the iOS 14.5 update?

Snezana - they mostly test on Android; iOS is not a big channel for them.

Lucia - big changes are coming and testing will look different. If you rely on SKAdNetwork only, you might need 1 concept per campaign, or at least a separate campaign dedicated to creative testing.


“Iterate and test until you die”


Re-testing old creatives

Snezana - if a creative doesn’t work now, it might work later.

Lucia

Whenever a creative (and its 2-3 potential iterations) doesn’t work, shelve it for 3-6 months and try it again (potentially on a different audience) if you’re running out of ideas.


Approach to testing concepts - FB split test?

Lucia - do not rely on FB A/B testing. Do it manually to make sure you have equal delivery.


